Automated Software Distribution file update question

I've been working on an initiative to 'tag' servers with attributes that identify which Application Portfolio program the servers belong to. This will help us highlight the criticality of each server, isolate and prioritize remediations, etc. I have a task that parses the server-specific data out of a JSON file and creates attributes on the server that we can report on in Web Reports or in the console, but I'm trying to remove a major step from the update process. The team that maintains our Application Portfolio data updates a file that has to be imported into Software Distribution and then parsed by the task that tags the machine.
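To illustrate the parsing step, here is a minimal sketch in Python (the JSON layout keyed by hostname, the file name, and the attribute names are hypothetical; the real file produced by the portfolio team may look different):

```python
import json
import socket

# Minimal sketch: pull this server's Application Portfolio attributes out of
# the JSON file. The layout below (a dict keyed by hostname) is hypothetical;
# adjust it to match the real file produced by the portfolio team.
PORTFOLIO_FILE = "application_portfolio.json"  # hypothetical path

def attributes_for_this_server(path=PORTFOLIO_FILE):
    hostname = socket.gethostname().lower()
    with open(path, "r", encoding="utf-8") as fh:
        portfolio = json.load(fh)
    # Assumed shape: {"server01": {"Program": "...", "Criticality": "..."}, ...}
    return portfolio.get(hostname, {})

if __name__ == "__main__":
    for name, value in attributes_for_this_server().items():
        # In the real task these would be written out as attributes/client
        # settings so they can be reported on in Web Reports or the console.
        print(f"{name}={value}")
```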

Is there any way to automate importing the file into Software Distribution and updating the task with the new prefetch information, so it isn't a manual process of importing the file, updating the task, and then deploying the task?

I originally tested the process by copying the file into the configs folder under wwwroot and using download now, but many of these servers are in isolated networks that don't have line of sight to the root server, so neither a public URL download nor a download from the root server URL will work. Any suggestions would be greatly appreciated.

This sounds like a good use case for a variation on Updating custom tasks.

As long as the file is fairly small, and it’s acceptable for every machine to get a copy of the file, you could attach the file as a Site File with the “Send to Clients” option.

The clients receive the file as part of their site gather, and you can code your Action to act based on the contents of the file. That way you don’t need to create a new SWD package, update the Tasks, or re-issue the Action.

You can automate the Site File upload with the REST API if necessary.
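If it helps, a rough sketch of that REST API upload might look like the following (Python with the requests library; the server URL, site name, credentials, and file path are placeholders, and the /api/site/.../files resource and its force parameter should be verified against the BigFix REST API documentation for your version):

```python
import requests

# Minimal sketch: upload (or overwrite) a file in a custom site on the BigFix
# root server via the REST API. URL, site name, and credentials are placeholders.
ROOT = "https://bigfix-root.example.com:52311"
SITE = "custom/MyCustomSite"          # site type/name as addressed by the REST API
LOCAL_FILE = "application_portfolio.json"

def upload_site_file(user, password):
    url = f"{ROOT}/api/site/{SITE}/files"
    with open(LOCAL_FILE, "rb") as fh:
        resp = requests.post(
            url,
            params={"force": "true"},          # assumption: overwrite an existing site file
            files={"file": (LOCAL_FILE, fh)},  # multipart/form-data upload
            auth=(user, password),
            verify=False,                      # or point at the root server's CA certificate
        )
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(upload_site_file("rest_user", "rest_password"))
```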



That sounds like a great idea. Let me go investigate importing the file into the site via the REST API.

Here’s another potentially useful reference as it relates to secure dynamic downloads via manifests: https://developer.bigfix.com/action-script/guide/dynamic_download.html


@JasonWalker @Aram Thank you both for your suggestions. I believe dynamic downloads combined with using the REST API to update the file in the site will do what I need.

As always, thank you guys for the quick assistance.

I wish we were allowed to mark more than one solution in a post, because the suggestions from both Aram and Jason helped. I was able to get this set up and it works beautifully.

Basically, I set up a process so that when the master tag file (an xlsx file containing information for all servers) gets placed or updated in a share, one fixlet policy action runs that processes the file and converts it to JSON to make it easier for BigFix to read and parse. That JSON file gets updated in the web server config directory, and the task creates a download.spec file with the file name, SHA1, SHA256, size of the file, and download path, then uploads it to the site. Once that file propagates to the clients, a second policy action becomes relevant because the hash in the download.spec file no longer matches the hash stored in a client setting I created. The second task then tags each server with the information relevant to that server and updates the client setting with the current hash.

I always love finding something new in BigFix that I wasn't aware of; it opens up a whole new world of possibilities.
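For anyone wanting to reproduce the download.spec step, here is a rough sketch of computing the hashes and emitting a manifest entry (the name=/sha1=/sha256=/size=/url= layout is an assumption based on the dynamic download guide linked above and should be checked against it; the file path and download URL are placeholders):

```python
import hashlib
import os

# Minimal sketch: compute sha1/sha256/size for the converted JSON file and
# write a one-line download manifest (download.spec) entry pointing at it.
JSON_FILE = "application_portfolio.json"  # placeholder path
DOWNLOAD_URL = "http://bigfix-root.example.com:52311/Uploads/application_portfolio.json"  # placeholder

def build_spec_line(path=JSON_FILE, url=DOWNLOAD_URL):
    sha1, sha256 = hashlib.sha1(), hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            sha1.update(chunk)
            sha256.update(chunk)
    size = os.path.getsize(path)
    return (f"name={os.path.basename(path)} sha1={sha1.hexdigest()} "
            f"sha256={sha256.hexdigest()} size={size} url={url}")

if __name__ == "__main__":
    with open("download.spec", "w", encoding="utf-8") as out:
        out.write(build_spec_line() + "\n")
```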
