One of our admins has a task whose relevance checks whether a file on his workstations is more than 24 hours old. If the file is more than 24 hours old, the task is supposed to download a fresh copy from an HTTP repository and update it. To get around having to change the SHA1 every time the file changes, he is using the “add nohash prefetch item” action. However, it appears that the BigFix server has cached the file once and continues to send out the same file even though the source file in the HTTP repository has changed. Is there a way to tell it, in the action, to only cache the file for a certain period of time? Or are we going about this the wrong way, and should we perhaps be using a different technique?
Here’s a sanitized version of my relevance:

((name of operating system as lowercase starts with "win") AND (computer name as lowercase starts with "pos")
    AND (exists file "c:\program files\software_package\some_file.dat" whose (date (local time zone) of modification time of it < date (local time zone) of now))
    AND (exists file "c:\program files\software_package\some_file.idx" whose (date (local time zone) of modification time of it < date (local time zone) of now)))
OR
((name of operating system as lowercase starts with "win") AND (computer name as lowercase starts with "opt")
    AND (exists file "c:\software_package\some_file.dat" whose (date (local time zone) of modification time of it < date (local time zone) of now))
    AND (exists file "c:\software_package\some_file.idx" whose (date (local time zone) of modification time of it < date (local time zone) of now)))
OR
((name of operating system as lowercase starts with "lin") AND (computer name as lowercase starts with "sc")
    AND (exists file "/home/directory/directory/some_file.dat" whose (date (local time zone) of modification time of it < date (local time zone) of now)))
Here’s a sanitized version of my action script:

begin prefetch block
add nohash prefetch item url=http://my.server.domain/software_package/some_file.dat
add nohash prefetch item url=http://my.server.domain/software_package/some_file.idx
end prefetch block

if {computer name as lowercase starts with "pos"}
    delete "c:\program files\software_package\some_file.dat"
    copy "__Download\some_file.dat" "c:\program files\software_package\some_file.dat"
    delete "c:\program files\software_package\some_file.idx"
    copy "__Download\some_file.idx" "c:\program files\software_package\some_file.idx"
elseif {computer name as lowercase starts with "opt"}
    delete "c:\software_package\some_file.dat"
    copy "__Download\some_file.dat" "c:\software_package\some_file.dat"
    delete "c:\software_package\some_file.idx"
    copy "__Download\some_file.idx" "c:\software_package\some_file.idx"
elseif {computer name as lowercase starts with "sc"}
    delete "/home/ccl/ccl/some_file.dat"
    copy "__Download/some_file.dat" "/home/ccl/ccl/some_file.dat"
    run chown ccl.users "/home/ccl/ccl/some_file.dat"
    run chmod ug+wx "/home/ccl/ccl/some_file.dat"
    delete "/home/ccl/ccl/some_file.idx"
    copy "__Download/some_file.idx" "/home/ccl/ccl/some_file.idx"
    run chown ccl.users "/home/ccl/ccl/some_file.idx"
    run chmod ug+wx "/home/ccl/ccl/some_file.idx"
endif
Unfortunately, a “nohash prefetch” item will only be cached once… If you could somehow look up the sha1 in relevance, this would work… If not, you might need to reissue the action periodically… Perhaps the Action Regenerator (see below) will help.
An alternative way that does work uses the older “download now as” method.
We use it successfully not only to download dynamically changing content, but also to do relevance substitution in the URL.
In this example, a PLU file is generated on a scheduled basis on the back end. It is picked up, compressed with the BFArchive tool, and placed into a folder structure based on store number on a home-office web server. We use a store-specific URL without hash checking to download the PLU file on an agent at the store. The agent can natively extract the contents since it was compressed with BFArchive.
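A minimal sketch of that approach (the server name, the “StoreNumber” client setting, and the target path are illustrative assumptions, not the actual values):

// pull a store-specific, BFArchive-compressed PLU file; no sha1 check is performed
// "StoreNumber" is an assumed client setting holding the store's folder name
download now as plu.bfa http://my.server.domain/plu/{value of setting "StoreNumber" of client}/plu.bfa
// the agent natively extracts BFArchive content into the __Download folder
extract plu.bfa
copy "__Download\plu.dat" "c:\pos\plu.dat"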
Hi, it seems that a lot of the potential benefits of the download functionality are negated by mandating the sha1 value in the download URL within the prefetch block. If the files to be downloaded are ‘dynamic’ in nature, then the sha1 and size are also likely to be dynamic.
Have I misunderstood? I hope so, and that someone will post an easy way to dynamically read the sha1 from a master file or something, without having to download it each time first and then parse it for values.
Just an update: I have created a copy of my original Fixlet using “download now as” instead of “add nohash prefetch item”, and it seems to be working. The only issue I had was having ~1400 boxes trying to hit my web server in very close proximity, and about half of them were failing. I have since adjusted the number of servers I’m running in Apache and used some staggering and retry options in the Fixlet / policy I’m using, and now it seems to be succeeding for the most part. I just sent out the policy and will continue to monitor. Let’s hope “download now as” doesn’t go away in the next release of BigFix, or that some viable alternative is provided to allow sending out dynamic files on a daily basis.
While John did find a workaround (we work for the same company), it is far from ideal. It would be nice if we could somehow prefetch the file without a sha1 and set a TTL on the caching. We have an extensive relay infrastructure in place that this Fixlet is entirely circumventing.
Prior to the introduction of 7.2 and the dynamic download functionality, I created this kind of Fixlet using the download now command and relevance substitution … then split the Fixlet into a client part and a relay part so clients still use the relays. It’s fairly complicated, and it was only necessary because over 12k clients were targeted, so it would not have been practical to hit a single HTTP server.
I was hoping that the prefetch block would solve the problem, which it almost does … there is mention of a command to execute a plugin of sorts that could be of use; it would help to see some examples of how this might work if the plugin could generate sha1 values.
The reason we require a sha1 is to ensure the integrity of the files distributed to your endpoints. It is very easy for an attacker to spoof URLs and get the client to download an unintended file. In most cases the requested download is then executed on the endpoint. It is virtually always a big mistake to download and execute content without verifying that the downloaded file is in fact the requested file. Therefore we have removed support for this type of download when using the newer prefetch syntax. ‘Download now as’ is a legacy command that for historical reasons made it into the action language, but you can probably expect it to go away at some point.
You can still achieve dynamic download behavior while verifying the integrity of the download. Depending on your requirements, there are a couple of possible ways. One example to look at is pattern-definition delivery in the CPM product. The way it works is:
1. A component on the BigFix server downloads the latest updates from Trend. This component verifies the authenticity of the downloaded components at this time.
2. A signed file containing metadata about the available files, including their sha1s, is published and distributed to BigFix endpoints.
3. A download plugin on the endpoints consumes the metadata and determines whether any of the available files are required.
4. If a file is required, a download request is submitted up the BigFix chain based on the published sha1.
At all points in the chain, the integrity of the file is verified. You can implement similar custom processes in your own environments to allow downloading of dynamic content using a single action. It just requires investing a little more effort in the process to ensure the security of the architecture.
Please communicate to BigFix customers well in advance of “download now as” being eliminated. We have important processes that rely on that command.
We understand that this approach isn’t ideal, but it is the only workable alternative readily available. BigFix is assuming that the downloaded file will be executed, so it is attempting to protect against rogue content. Generally, that makes good sense. However, there should be manual, non-default parameters that can bypass those checks.
Whatever alternatives are created, we need some mechanism to do relevance substitution in the download URL and, since we don’t necessarily know the sha1 of a dynamically generated file, a hashless way to do the download.
Consider this example: we have an ERP system that dynamically builds globally unique configuration information and drops it into a folder structure, based on location name or number, on a web server. Thousands of machines in the field download the most current file that is personalized just for them by going to the appropriate folder on the web server, based on location number, via relevance substitution in the download URL. This works splendidly for getting the proper dynamic information to the correct machine in the field.
Another case: there are automated business processes by which vendors exchange information with us dynamically, in an agreed-upon cadence and convention. As the files are dropped into a folder structure centrally, machines in the field running policy actions periodically download the proper file based on their location code and insert it into their local process. Again, hashless “download now as” with relevance substitution is key to making this work.
To come at this from a different angle: the assumptions and controls that are desirable around a traditional patch or deployment process in BigFix are not necessarily the same when BigFix is used as a business process vehicle. To us, BigFix is more than a security management or deployment product; it provides a mechanism for globally unique and “just in time” business processes.
If BigFix continues with the hashing requirement and the phase-out of “download now as”, then I strongly urge the creation of a utility (available as part of the base product, not just CPM) that could be used with any business process that produces unique files. Perhaps a configurable “file-watcher” type utility that could automatically hash incoming files and dynamically publish the hash list to a BigFix “whitelist”. Bonus points for including an option to automatically compress the files as well.
Even in the case where you are not distributing executables, there is always a risk that the file is maliciously crafted such that when it is used it takes advantage of a vulnerability. There are numerous examples of exploits that travel via infected PDF, JPG, or other non-executable file types. For this reason, we consider it vital to verify all content the BigFix client places on an endpoint.
When you have a system such as BigFix that is able to rapidly distribute content to an entire organization the risk of a potential problem is amplified dramatically. There are a lot of ways to shoot yourself in the foot with BigFix, but we do our best to avoid making it easy. Requiring a verification mechanism for all downloads is one way we do so.
If you do want to have your clients download dynamic content, that is possible with the tools available today. For example, you can use the PropagateFiles utility to place a manifest with the sha1s and URLs of the latest available files in a custom site. Your action can then query the manifest and use relevance substitution to request the file, knowing what its sha1 should be. It would be up to the process you put in place to verify the integrity of the manifest prior to pushing it to the clients.
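A minimal sketch of the client side of that idea (the manifest file name, its layout, and the copy target are assumptions for illustration; the 7.2+ relevance substitution in “add prefetch item” is discussed later in this thread):

// manifest.txt, propagated into the custom site, is assumed to hold one
// ready-made prefetch line per file, e.g.:
//   name=some_file.dat sha1=<40-hex-sha1> size=<bytes> url=http://my.server.domain/software_package/some_file.dat
begin prefetch block
add prefetch item {line 1 of file "manifest.txt" of client folder of current site}
end prefetch block
copy "__Download\some_file.dat" "c:\software_package\some_file.dat"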
It’s true that the secure solution is rarely the easy solution, but it’s really not bad in this particular case, and I believe it is worth the trade-off against the risk of compromising your entire organization.
You guys have entered an ongoing internal debate that has raged at BigFix for several years now…
I am in favor of this command for the reasons that you guys mentioned, but we haven’t been able to agree internally about how to handle the security of such a design and we would hate to open a security hole…
All, you might try this to accomplish the “recurring download” functionality:
Create a custom Fixlet using the “download” command (no sha1) to download your file… This will use the relays to download the file, but it will only serve one file (it won’t re-download and re-cache the files if they change on the source server).
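A minimal sketch of such a Fixlet action, reusing the sanitized URL and paths from earlier in the thread:

// legacy hashless download; relays cache this file by URL, so re-issuing the
// action (e.g. via the Action Regenerator, below) is what forces a fresh copy
download http://my.server.domain/software_package/some_file.dat
copy "__Download\some_file.dat" "c:\software_package\some_file.dat"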
To get the re-downloading to work, use the Action Regenerator (http://support.bigfix.com/bes/misc/actionregenerator.html) and modify the configuration options to point to the Fixlet ID / site that you created. Use a Windows scheduled task (or whatever mechanism) to run this periodically.
This should give you the benefits that you need without too much hassle… You should consider making a separate BigFix user just to run this utility and you should secure the private key/password with locked down Windows permissions.
If you want to take this to the next level, the BigFix API will allow you to automate almost anything; our services team (http://support.bigfix.com/services/) can help, or you can build it yourselves.
Just an idea, but if a native command were allowed in the prefetch block that could parse a text file from a URL containing sha values etc., that might do the trick. What if we created an executable that could read from an HTTP URL and used that as the plugin; would that work in theory?
Thanks for your perspective. I heartily agree that having verified content is definitely preferable. However, having read the documentation for the Action Regenerator (Ben’s suggestion) and PropagateFiles (Jesse’s suggestion), there are still some gaps.
Both of these approaches appear to assume a moderate frequency of change of a file (say, a weekly AV def) which is applicable to the organization at large, and where the hash of the file is known in advance of deployment.
Contrast that with a dynamic frequency of change of unique files with unknown unique hashes, generated just-in-time from an ERP system (it may be zero to dozens of files per day, on demand, per business conditions), that are pulled by select endpoints, where each unique machine gets its own globally unique file.
The current process, “download now as” with relevance substitution in the URL to a folder structure based on name or location, allows the unique endpoint to insert the correct globally unique file just-in-time during an automated business process flow. This approach currently works well. If I attempted the Action Regenerator, it could hash, package, and deploy an action, but what would be its target? When? In the current model, the unique endpoint knows what it is supposed to get, and the timing with which it is supposed to be received, based on relevance which is, in turn, based on business logic. Under the Action Regenerator, how does it know that a table of SKUs, for example, belongs to a given unique endpoint, and how does it target that endpoint at the correct time in the endpoint’s “work flow”?
It is a “generic push” versus a “specific unique pull” discussion. Does that make sense?
@khanand: Under version 7.2 and higher you can use relevance substitution in the ‘add prefetch item’ command to read from a local text file. You could use a plugin that downloads a file, reads sha1s from it, and writes the output to a file, which is then parsed by the ‘add prefetch item’ command. The key would be to ensure the integrity of the downloaded file. You’d probably want to sign it and have the plugin verify the signature.
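A rough sketch of how that might be wired together with the 7.2 ‘execute prefetch plug-in’ command (the plugin name, the placeholder hash values, the argument convention, and the output location are all assumptions; the plugin itself would be custom code):

begin prefetch block
// fetch the custom plugin itself with a known, static sha1 (placeholder values)
add prefetch item name=manifestreader.exe sha1=<sha1-of-plugin> size=<size-of-plugin> url=http://my.server.domain/tools/manifestreader.exe
collect prefetch items
// the plugin is assumed to download the signed manifest, verify its signature,
// and write a ready-made prefetch line (name=... sha1=... size=... url=...) to response.txt
execute prefetch plug-in "__Download\manifestreader.exe" response.txt
add prefetch item {line 1 of file "response.txt" of client folder of current site}
end prefetch block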
@JonL: The frequency of AV updates is approximately once an hour these days; that is our recommended interval when checking for updates under CPM. To accommodate your ERP system, you would just need to add an appropriate business process to update the list of available files when they change and publish the list through BigFix. It would be a “generic push” of the available-files manifest, and then each client would perform a “specific unique pull”. You would in fact be doing both.
We don’t use CPM, so I’m not familiar with the manifest process. I’m sure there are docs on that somewhere.
So let’s say that manifests of ERP files are published into BigFix. The clients in the field see a list of known good files, but not necessarily which specific file is applicable to them. Since the known good files are not universally applicable, like an AV def is, the missing link is the association so that a given endpoint knows which one of the manifested files is correct. With “download now as”, this is handily addressed with relevance substitution in the URL. Per the WinActions guide, “Relevance substitution is NOT performed on the prefetch action command lines.”
How would you recommend an automated manifest solution such that a given endpoint knows which, if any, of the hundreds of manifested files are appropriate for that globally unique endpoint?
Sorry for not responding. I took a couple of days off. Starting with version 7.2, the ‘add prefetch item’ command allows relevance substitution. I’m not sure if the docs have been updated to reflect that.
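Given that, each endpoint could pick its own line out of a pushed manifest with something like the following sketch (the “LocationCode” client setting and the line layout are illustrative assumptions, and exactly one manifest line is assumed to match per endpoint, since the substitution must return a single result):

// manifest.txt is assumed to hold one line per location, e.g.:
//   0042|name=store0042.dat sha1=<40-hex-sha1> size=<bytes> url=http://my.server.domain/erp/0042/store0042.dat
begin prefetch block
add prefetch item {following text of first "|" of (lines whose (it starts with (value of setting "LocationCode" of client & "|")) of file "manifest.txt" of client folder of current site)}
end prefetch block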
The current version of the WinActions PDF on the site is from 11/10/08 (rather old). It explicitly says at the top of page 18: “Relevance substitution is NOT performed on the prefetch action command lines.”
Apparently that has changed. Does BigFix have any plans to update the docs? Is there anywhere online where the “add prefetch item” syntax is detailed?
Also, can you point me to a doc on the manifest process?
JonL … thanks for the pointer to this discussion; this is basically what I am looking for.
I am in a similar position as JonL … I have the code written for generating the manifest file and dynamically downloading a file using relevance substitution, BUT … I do not want to push a new manifest file to every client just because it changed, and I expect it to change on a regular basis … at least once a day, if not more.
SO … I’m back to needing to download a file without requiring a sha1 and size, and without having it cached, so that I get the updated version of the file; then I will parse out the sha1, size, and everything else I need to download the file I need.
Isn’t there some way that the file could be encoded or signed by BigFix to verify its integrity so that it could then be safely downloaded to a client without requiring the checks? It would seem that if there were some way to “register” a file with the BigFix server, this would suffice as proof … the sha1 and size could then be passed within BigFix without the client needing to know them.