How to use the Airgap tool to update site content and download patch updates? BigFix 11.0.0

I have been tasked with administering patches using BigFix in an air-gapped environment, so over the last week or two I have been giving myself a crash course in the BESAirgapTool. I have a few questions about this tool to clear up my understanding of it and put together a good patching strategy.

Question 1: Should the internet-facing “site gathering” computer be the same machine each time the patching process is performed? I ask because I have noticed the “Airgap.db” and “AirgapCache.db” files that live in the Airgap tool’s directory. Do these DB files track the content/download history of the site-gathering computer?

Say the Airgap tool moves to a different computer next patching cycle, perhaps because a different operator performs the run. It will then download the whole site content again instead of just what has changed since the tool was last run, correct? I believe this is what is happening in my environment. To get things back up to date, would it be recommended to do a fresh initial Airgap request from the BigFix server?

Question 2: When you select option “A” for your sites to gather site content and download Fixlet update files, you then move the AirgapResponse file to the server and import it to update site content. How do you get the downloaded update files moved over to the server as well?

Question 3: Is there a reason you would need to split the “site gathering” computer that gathers site content and the “patch download” computer into two different machines?

Any other general advice, tips, and tricks to get me up to speed on this process would be appreciated.

Thanks in Advance!

I’m working with lots of Airgap clients!

Question 1: The AirgapResponse file contains the sites’ content “metadata” (Fixlets, etc.). By default, the site list is configured with the “A” option, which downloads only the sites that have changed since the last Airgap run (every run is recorded in the DB files). I suggest changing it to the “R” option, so that every run gives you the full site content metadata (a small sketch of automating that file-list change is below).
If you are creating a new installation, you can simply create an initial AirgapResponse with the default file list, import it, enable all of the relevant sites in the console, enable the same sites in the file list, and then download the “full” AirgapResponse and import it again.
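If you would rather not edit the file list by hand every cycle, a small script can flip the options for you. This is only a sketch: the file name (“filelist.txt”) and the “option letter followed by the gather URL” line layout are my assumptions, so compare it against the file list your Airgap tool actually generates before using anything like it.

```python
# Hypothetical sketch: switch every site entry in the Airgap file list from
# "A" (changes only) to "R" (full re-gather). The file name and the
# "<option letter> <gather URL>" layout are assumptions - check the file list
# your tool generates before running anything like this.
from pathlib import Path

filelist = Path("filelist.txt")  # assumed name/location of the site file list
lines = filelist.read_text().splitlines()

updated = []
for line in lines:
    stripped = line.strip()
    # Assumed layout, e.g. "A http://sync.bigfix.com/cgi-bin/bfgather/bessupport"
    if stripped.startswith("A "):
        updated.append("R " + stripped[2:])
    else:
        updated.append(line)

filelist.write_text("\n".join(updated) + "\n")
print(f"Rewrote {filelist} with R (full gather) options")
```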

Question 2: On the internet-facing machine, when you use the download option, the tool first creates the AirgapResponse file and then downloads the files into its cache folder; this is described in the documentation.
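Regarding how the downloaded files get to the server (your Question 2): after you physically move the AirgapResponse file and the cache folder across the air gap, the response is imported with the Airgap tool, and the cached payloads are placed in the BigFix server’s own download cache (the “sha1” folder under wwwrootbes\bfmirror\downloads), where the server expects pre-staged files named by their SHA-1 hash. Here is a minimal copy sketch assuming a default Windows install; both paths are examples and should be adjusted to your environment.

```python
# Minimal sketch: copy the payloads the Airgap tool downloaded into its cache
# folder over to the BigFix server's download cache ("sha1" folder), where the
# server looks for pre-staged files. Both paths below are assumptions.
import shutil
from pathlib import Path

airgap_cache = Path(r"D:\AirgapTool\Cache")  # assumed location of the tool's cache folder
sha1_cache = Path(r"C:\Program Files (x86)\BigFix Enterprise\BES Server"
                  r"\wwwrootbes\bfmirror\downloads\sha1")

copied = 0
for f in airgap_cache.iterdir():
    if f.is_file():
        # Keep the original file name - it is the SHA-1 hash the server keys on.
        shutil.copy2(f, sha1_cache / f.name)
        copied += 1

print(f"Copied {copied} cached downloads into {sha1_cache}")
```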

Question 3: I’m using the same machine. Just remember that you will need enough disk space.
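One small addition on the disk space point: a quick pre-flight check can save a failed download run. The path and the 50 GB threshold below are placeholders, not BigFix requirements.

```python
# Pre-flight check: make sure the drive holding the Airgap tool's cache has
# some headroom before starting a download run. Path and threshold are
# arbitrary examples only.
import shutil

cache_drive = r"D:\AirgapTool"  # assumed location of the Airgap tool / cache
required_gb = 50                # example threshold only

usage = shutil.disk_usage(cache_drive)
free_gb = usage.free / (1024 ** 3)

if free_gb < required_gb:
    print(f"Only {free_gb:.1f} GB free on {cache_drive}; consider freeing space first.")
else:
    print(f"{free_gb:.1f} GB free on {cache_drive}; enough for this run.")
```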

Thanks Orbiton. Which documentation are you referring to, exactly? There seems to be all kinds of outdated documentation out there, which makes it a bit confusing to know what to follow.

https://help.hcltechsw.com/bigfix/10.0/platform/Platform/Config/c_airgap_tool_NonExtr.html

Thanks, I think I am finally getting the hang of it.

Once the files get moved from the cache directory on the internet-facing machine, where do they go next? Do they get ingested in some way, or do they live in a cache directory on the server until you manually archive them? Presumably they remain available for future use?