
I’m really sorry for the delay - approvals have been obtained and I’ll upload the code to a public-facing repo today and post a link. Your understanding is generally correct on all points. We only update daily because we have no agent or driver that can update the SQLite database on file writes; adding one would push this into commercial EDR territory.

BigFix deploys a PowerShell script daily to subscribed endpoints. The script does the following (a rough sketch appears after the list):

  1. If a SQLite database file already exists, reads its hashes table into memory.
  2. Traverses the drive looking for “interesting” file types. For each file found, if it is new, or if its size or modification timestamp has changed since the last run (compared against the in-memory structure), hashes the file.
  3. Commits the current in-memory structure back to a SQLite table.
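
Here’s a minimal sketch of that flow. It assumes the PSSQLite module (Invoke-SqliteQuery) for database access; the actual script may use a different SQLite binding, and the paths, extension list, and hashes table schema below are illustrative rather than the published ones.

```powershell
Import-Module PSSQLite

$DbPath     = 'C:\ProgramData\FileHashes\hashes.sqlite'   # illustrative location
$Extensions = '.exe', '.dll', '.ps1'                      # example "interesting" file types

# 1. Load the previous run's hashes table into memory, keyed by full path.
$Previous = @{}
if (Test-Path $DbPath) {
    Invoke-SqliteQuery -DataSource $DbPath -Query 'SELECT path, size, mtime, sha256 FROM hashes' |
        ForEach-Object { $Previous[$_.path] = $_ }
} else {
    New-Item -ItemType Directory -Path (Split-Path $DbPath) -Force | Out-Null
}
Invoke-SqliteQuery -DataSource $DbPath -Query @'
CREATE TABLE IF NOT EXISTS hashes (path TEXT PRIMARY KEY, size INTEGER, mtime TEXT, sha256 TEXT)
'@

# 2. Walk the drive; hash only files that are new or whose size/mtime changed.
$Current = @{}
Get-ChildItem -Path 'C:\' -Recurse -File -ErrorAction SilentlyContinue |
    Where-Object { $Extensions -contains $_.Extension } |
    ForEach-Object {
        $old   = $Previous[$_.FullName]
        $mtime = $_.LastWriteTimeUtc.ToString('o')
        if ($old -and $old.size -eq $_.Length -and $old.mtime -eq $mtime) {
            $sha = $old.sha256        # unchanged since last run: reuse the stored hash
        } else {
            $sha = (Get-FileHash -LiteralPath $_.FullName -Algorithm SHA256 -ErrorAction SilentlyContinue).Hash
        }
        if ($sha) {
            $Current[$_.FullName] = @{ path = $_.FullName; size = $_.Length; mtime = $mtime; sha256 = $sha }
        }
    }

# 3. Commit the in-memory structure back to the SQLite table.
Invoke-SqliteQuery -DataSource $DbPath -Query 'DELETE FROM hashes'
foreach ($row in $Current.Values) {
    Invoke-SqliteQuery -DataSource $DbPath `
        -Query 'INSERT INTO hashes (path, size, mtime, sha256) VALUES (@path, @size, @mtime, @sha256)' `
        -SqlParameters $row
}
```

Inserting rows one at a time is the slow part; if you go this route, wrapping the inserts in a transaction or using PSSQLite’s Invoke-SqliteBulkCopy is worth a look.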

Once the SQLite files exist, BigFix analyses can query them easily with the sqlite inspector, and we can run the same queries ad hoc via the Web UI.
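
For example, assuming the same illustrative schema as the sketch above, an ad-hoc lookup on a single endpoint for a known-bad hash could look like this (an analysis or Web UI query expresses the same question in relevance):

```powershell
# Hypothetical ad-hoc lookup: has a given SHA-256 ever been recorded on this endpoint?
# The DB path and table/column names match the sketch above and are assumptions.
Invoke-SqliteQuery -DataSource 'C:\ProgramData\FileHashes\hashes.sqlite' `
    -Query 'SELECT path, mtime FROM hashes WHERE sha256 = @ioc' `
    -SqlParameters @{ ioc = '<suspect-sha256-here>' }
```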

Before implementing this yourself, ask whether daily hash updates are sufficient or whether you really need a continuously updating EDR suite. This is a stopgap: a cheaper approach that doesn’t require installing another agent on the endpoints. Would I rather have something like Carbon Black? Sure. But that’s a significant additional expense and another agent you may not be able to deploy broadly.

Additionally, you should test extensively - I’ve pushed this for a week or so to a small test pool of machines and nobody noticed, but depending on your users and their usage patterns, the I/O load from scanning may be noticeable.
