AirGapped Network. Any way to point BigFix for patches to Microsoft SCCM as a MS content repository?

I have an air-gapped system that currently has a very large SCCM implementation.

It's nearly impossible to download all of the Microsoft content on a periodic basis when SCCM is already pulling that same Microsoft content into the environment.

Is there any way, for Windows patching, to point to SCCM as the content source, much like we do with Red Hat patching?


The short answer is no.

The long-term answer is to file an RFE to request this be added in the future, though I wouldn’t expect it anytime soon. See: How to ask for IBM product help: PMRs, RFEs, and more

The long answer is that you could technically change the hosts file on the root server to make it look at an internal web server in your organization instead of Microsoft’s download site. You would then have to populate this web server with all of the Microsoft downloads you wish to use, at the exact paths where they are normally located on Microsoft’s site and in the prefetch commands of the BigFix patches. It might be possible to automate this somehow with the REST API, but it wouldn’t be easy.
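If you did go down this road, the first step would be figuring out which URLs the fixlets actually reference. A minimal sketch of extracting them from fixlet action scripts, assuming the standard `prefetch <name> sha1:<hash> size:<bytes> <url>` line format (the sample values are made up):

```python
import re

# BigFix prefetch lines generally look like:
#   prefetch <name> sha1:<hash> size:<bytes> <url>
PREFETCH_RE = re.compile(
    r"^prefetch\s+(?P<name>\S+)\s+sha1:(?P<sha1>[0-9a-f]{40})\s+"
    r"size:(?P<size>\d+)\s+(?P<url>\S+)",
    re.MULTILINE,
)

def extract_prefetches(action_script: str):
    """Return (name, sha1, size, url) tuples for every prefetch line found."""
    return [(m["name"], m["sha1"], int(m["size"]), m["url"])
            for m in PREFETCH_RE.finditer(action_script)]
```

Run over the action scripts of the fixlets you care about (e.g. pulled via the REST API), this gives you the list of paths your internal web server would need to mirror.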

Ideally you would use a proxy server to allow the BigFix root server to talk to Microsoft’s site and Microsoft’s site only. This does remove some of the air gap, but I think in a reasonable and controlled way that doesn’t open up any potential attacks on the root server.

There is another method that might work. See Step # 3 in this article

This gives you a way to download all of the files your fixlets would need using the BESDownloadCacher.exe

You would need to use the masthead file for the site you need to pull downloads for (in this case it would be Enterprise Security.efxm).

Once you download all the files, they should be named by their SHA-1 values.

You could then take them over to your airgapped BigFix server via USB sneaker net and drop them in the sha1 download directory.
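Before dropping the files into the cache, it's worth confirming that each filename really matches the file's content hash, since a corrupted USB transfer would otherwise go unnoticed until a deployment fails. A quick sketch:

```python
import hashlib
import pathlib

def verify_sha1_named(directory: str) -> list[str]:
    """Return the names of files whose SHA-1 does not match their filename.

    Assumes every file in `directory` is supposed to be named by its
    lowercase SHA-1 hex digest, as in the BigFix sha1 download cache.
    """
    bad = []
    for f in pathlib.Path(directory).iterdir():
        if f.is_file():
            digest = hashlib.sha1(f.read_bytes()).hexdigest()
            if digest != f.name.lower():
                bad.append(f.name)
    return bad
```

An empty return list means everything in the directory checks out.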

Note: you’ll need to increase the size of this cache via a client setting, which could be deployed via a fixlet.

You’ll need to increase it to something large enough to accommodate the entirety of the downloads, plus some headroom so that needed files don’t fall off the cache during LRU churn. (Lots of gigabytes needed here.)
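As a sketch, a fixlet action to bump the cache could look something like the following ActionScript. The value here (100 GB) is an example; and double-check the setting name against your BigFix version's documentation — `_BESGather_Download_CacheLimitMB` is my recollection of the server download cache setting:

```
setting "_BESGather_Download_CacheLimitMB"="102400" on "{parameter "action issue date" of action}" for client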

Then all the download files you need will be staged and ready to go.

Let me know if you try this and run into any problems so I can validate the advice given here.

I realize this doesn’t address the need for not having to download everything from month to month. Just some ideas that might be incorporated into what you are trying to do.


You could also copy all the files from the SCCM repo and put them in the cache on the BigFix server, similar to the previous suggestion, but you won’t need to download the files again. You will need to rename all the files to their SHA values, but you should be able to script that fairly easily.
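A minimal sketch of that copy-and-rename script, assuming you point it at a local copy of the SCCM content and at the BigFix server's sha1 cache directory (both paths are placeholders):

```python
import hashlib
import pathlib
import shutil

def stage_sccm_files(source_dir: str, cache_dir: str) -> int:
    """Copy every file under source_dir into cache_dir, renamed to its SHA-1.

    Returns the number of files staged. Duplicate content collapses to a
    single cache entry, since identical files hash to the same name.
    """
    cache = pathlib.Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    staged = 0
    for f in pathlib.Path(source_dir).rglob("*"):
        if not f.is_file():
            continue
        sha1 = hashlib.sha1()
        with f.open("rb") as fh:  # hash in 1 MB chunks; patch files can be large
            for chunk in iter(lambda: fh.read(1 << 20), b""):
                sha1.update(chunk)
        target = cache / sha1.hexdigest()
        if not target.exists():  # skip content already staged
            shutil.copy2(f, target)
            staged += 1
    return staged
```

Something like `stage_sccm_files(r"D:\SCCMContentLibrary", r"C:\Program Files (x86)\BigFix Enterprise\BES Server\wwwrootbes\bfmirror\downloads\sha1")` would do the whole tree in one pass; the paths are illustrative, not canonical.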


Copying the files from SCCM to the BigFix server, named by their SHA-1 values, is an excellent idea. Is there any sample code out there in the BigFix community that we could leverage for this?

The environment will have a mix of Windows clients that are directly managed by SCCM for patching, and some that cannot join a Windows domain, or that have their own Windows domain.

Thanks much!

Haven’t seen any scripts for this, but robocopy is a good utility for mirroring folders between two servers. And we have a tool that can calculate the sha1 here:!/wiki/Tivoli%20Endpoint%20Manager/page/SHA1%20Tool

A few years ago I worked at a customer with a very robust air gap. Luckily they had an existing process for importing all the patches from MS and a few Linux distros, so they set up a web server on their repo that we could access. We added the BigFix updates to their import process.
I wrote a new download plugin in Python to replace the out-of-the-box HTTP service, so whenever an HTTP request was made it went through a sequence of searching a list of web repos until it found the file needed.
I also wrote a utility to download Site content ready to be brought in through the airgap.

Sadly I can’t share this publicly as it’s IBM property but it could be resurrected as part of a Services engagement.
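The general idea is simple enough to sketch independently, though. This is a generic illustration of the search-a-list-of-repos pattern, not the IBM tool; the repo URLs are placeholders, and the `opener` parameter exists only to make the function easy to test:

```python
import urllib.error
import urllib.request

# Hypothetical mirror list -- replace with your internal repo URLs.
REPOS = [
    "http://repo1.internal.example/microsoft",
    "http://repo2.internal.example/microsoft",
]

def fetch_from_repos(relative_path, repos=REPOS, opener=urllib.request.urlopen):
    """Try each repo in order; return the bytes of the first successful download."""
    for base in repos:
        url = f"{base}/{relative_path}"
        try:
            with opener(url, timeout=30) as resp:
                return resp.read()
        except urllib.error.URLError:
            continue  # not available on this repo; fall through to the next
    raise FileNotFoundError(relative_path)
```

A real download plugin would also need to verify the hash of what it fetched and write the file where the BES server expects it, but the fallback loop above is the core of the technique.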