I need to parse the contents of an XML file (example: file.xml) that contains a randomly generated hardware ID (among other things). The string I want to look at inside this file looks like this: HardwareID ID="Q00B10BC3FA4FD5A79E22923F2EF6F5F". If two machines have the same hardware ID, I want to delete the entire XML file. When that happens I also want to clear the data of a registry value, which looks something like this: HKLM\Software\AppName\AppName2\XXX\Name\Name\ Type: REG_SZ, Name: HardwareID, Data:
The Data field contains the same hardware ID as the XML file. I want to blank it out if two machines have the same HardwareID.
Is this possible with BigFix? I assume that to get this to work, BigFix would have to parse the hardware ID out of the XML file and write the value to some centralized file on a share; if that value is already in the file, the machine that tried to write it would get a Fixlet to delete the XML file and blank out the registry data above (just the data, not the whole key).
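As a starting point, extracting the ID from the file is straightforward regardless of where the comparison ultimately happens. A minimal sketch in Python, assuming the file contains a fragment shaped like the one shown above (the exact element nesting isn't given in the post, so a regex is used rather than a full XML parse):

```python
import re

# Matches the fragment shown in the post: HardwareID ID="Q00B...".
# The surrounding XML structure is an assumption, so we search the raw
# text instead of relying on a specific element path.
HWID_RE = re.compile(r'HardwareID\s+ID="([A-Z0-9]+)"')

def extract_hardware_id(xml_text):
    """Return the hardware ID embedded in the file's text, or None."""
    m = HWID_RE.search(xml_text)
    return m.group(1) if m else None

sample = '<Config><HardwareID ID="Q00B10BC3FA4FD5A79E22923F2EF6F5F"/></Config>'
print(extract_hardware_id(sample))  # Q00B10BC3FA4FD5A79E22923F2EF6F5F
```

The same extraction could be done in BigFix relevance with string inspectors against the file's contents; the Python version is just the easiest place to see the logic.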
It depends on whether you want to do this once (or just every once in a while) or on a regular basis. It might be worth going to the root cause and figuring out how/why you're getting duplicates in the first place. (Keep a list of issued IDs and ensure new ones are unique?)
Ad hoc
-Create an analysis to pull the IDs.
-Copy the output into Excel.
-Sort and find duplicate IDs.
-Take the list of duplicate IDs and add it as a parameter to the task (or edit the task) so it's used in the appropriate action check and/or relevance.
-Run the task to remove the file/registry data on any machine whose ID matches one in the supplied list.
Automated
-Create a task to output the computer's ID to a file and use the upload manager to send it to the BES server.
You'll then need to create a script which will:
-Read/parse the uploaded files and extract the IDs they contain.
-Find duplicates.
-Run the action regenerator to modify the action/task with the updated list of duplicates.
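The duplicate-finding step of that server-side script could look like the sketch below. It assumes each uploaded file holds one hardware ID on its first line and sits somewhere under the upload directory; the `.txt` extension and directory layout are assumptions, since the upload manager's on-disk layout depends on your server configuration.

```python
from collections import defaultdict
from pathlib import Path

def find_duplicate_ids(upload_dir):
    """Scan uploaded ID files and return {id: [filenames]} for any ID
    reported by more than one computer.

    Assumes one hardware ID on the first line of each file; adjust the
    glob pattern to match however your upload task names its files.
    """
    seen = defaultdict(list)
    for path in Path(upload_dir).rglob("*.txt"):
        text = path.read_text().strip()
        if not text:
            continue  # skip empty uploads
        seen[text.splitlines()[0]].append(path.name)
    return {hwid: names for hwid, names in seen.items() if len(names) > 1}
```

The resulting ID list is what you would feed to the action regenerator so the remediation task targets only the machines with duplicates.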
I would like to make it automatic. We deployed a piece of software that reports to a central server (similar to BigFix). Each client is supposed to have a unique ID, but one of my techs put the agent on the system we use to make images for other systems. Now a bunch of machines have the same hardware ID and overwrite each other's logs on the central server. I want to correct this without having to manually touch each machine. Another option is to just nuke the XML file on every machine we own (a one-time action for all systems) and clear the registry data field. The clients will then automatically regenerate a unique ID by themselves, as opposed to fixing them one at a time manually or with a complicated Fixlet. I'm a BigFix newbie, so both methods are beyond me at this point. (The registry one I could do; deleting a file I'm not sure about.)
It's hard to get clients to correlate information from other clients. You could pretty easily run a one-time action targeting the clients with duplicates known on the server, but it will be hard to put an ongoing policy in place that acts whenever a new duplicate occurs.
How awful would it be to just nuke the XML file one time on each machine? That would be super easy:
I think it would be easier to just delete the XML file and registry data once on every computer, since the clients will automatically generate a new unique ID. So if I use:
Relevance:
exists file "foo.xml" of folder "bar"
Action:
delete {pathname of file "foo.xml" of folder "bar"}
regdelete "[HKEY_LOCAL_MACHINE\Software\Bar]" "Foo"
Does regdelete just remove the data of the Foo value, or does it delete the whole key? (I only want to remove the data and leave the registry key in place.)