We’ve been using BigFix for about seven years now and over that time operators have created a fair number of analyses and custom properties. In reviewing these, I’m seeing duplication of info we’re collecting from computers and would like to implement some guidelines/process for operators to follow when creating a new analysis or property.
I’d like to learn how other orgs are managing operator creation of analyses/properties given the potential impact these can have on agent/computer performance. Do you have a process you follow or do operators create and collect whatever they want? Do you review collected properties? How? How often?
Also, are there best practices around the number and complexity of collected properties? We know that global file searching in property relevance is a bad idea because of its performance impact. Any other suggestions?
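By “global file searching” I mean property relevance like the first expression below, which walks the entire drive on every evaluation, versus a targeted check like the second. This is a hypothetical illustration; the file name and path are made up:

```
exists descendants whose (name of it as lowercase = "myapp.exe") of folder "C:\"

exists file "myapp.exe" of folder "C:\Program Files\MyApp"
```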
One thing I am trying to implement is a list of every analysis: what is being collected, why, and the possible answers (where results are limited to specific values).
This isn’t done yet, so I can’t say whether it works, but I mention it because it is something we are dealing with as well.
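If it helps, a starting point for that inventory can be pulled with session relevance (for example in Web Reports or the Fixlet Debugger’s session relevance tab). This is only a sketch; verify the property names used here (analysis flag, applicable computer count) against your platform version:

```
(name of site of it, name of it, applicable computer count of it)
 of bes fixlets whose (analysis flag of it)
```

That gives the site, analysis name, and how many computers it applies to; the “what is collected / why / possible answers” columns still have to be filled in by hand, since the platform can’t tell you intent.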
Ideally, operators create analyses in their operator sites or in custom sites with limited applicability, so their analyses only affect the devices they are responsible for.
Avoid creating analyses in the Master Action Site.
Check for “Slow Evaluations” (one approach is sketched below). Get rid of very slow evaluations, and reduce the reporting frequency from “Every Report” to every x hours or days, depending on how important timely data from that analysis is.
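As one way to review reporting frequency, a session relevance sketch along these lines lists each property with its evaluation period. The evaluation period and source analysis inspectors of bes property are assumptions on my part, so check them against the session inspector documentation for your version:

```
(evaluation period of it as string | "n/a", name of it, name of source analysis of it | "n/a")
 of bes properties
```

Sorting on the first column makes it easy to spot properties that report far more often than their data really needs.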
We have some operators who are scoped to manage a subset of computers, but the majority, including our most active operators, manage all computers on our campus. I’ll check for the “slow evaluations” - thanks.