ENDED: BigFix Sessions & Global User Group Meeting @ IBM Think 2018

The BigFix Open Mic Tech Talk at 10:30 a.m. on Tuesday, March 20, will also be broadcast live from the IBM Security Learning Academy, so people who are not at Think 2018 can attend remotely.
https://www.securitylearningacademy.com/static/openmic.html

Details about the topic, “Using the new AirGap ‘Non Extraction’ method to gather sites and content from BigFix V9.5.5,” are at http://www.ibm.com/support/docview.wss?uid=swg27050828

I’ve seen no information about it being streamed anywhere, so I am assuming it will not be. Am I wrong? Is there any chance to join it online?

We have not done an online stream for any of the BigFix user groups. It is highly requested, but we don’t have a great way to do it, and all of the speakers would have to agree to being recorded and/or broadcast, which many specifically cannot.

It might be more feasible to do a live broadcast without recording it, but even that would require some effort to arrange with all of the speakers.

I finished creating a Google Calendar for the BigFix events at IBM Think 2018. Some of the unofficial get-together time slots I have marked don’t have a location yet, but that might be something we figure out on the fly in Slack.

Just got my ticket! Can’t wait!

Hey James, is there any chance the official sessions happening at Mandalay Bay South Level 1 and Level 2 will have an online stream?

Catch the livestream of the Chairman’s Address, keynotes, and Innovation Talks from any device at ibm.com/think2018.

Presentation on how to get the most out of the conference: https://docs.google.com/presentation/d/1W2bqDLFrQewW63Lsq7s66UUzJEqaHnRLW0FVoHQ1O8o/edit?usp=sharing

Apparently my “Meet the experts” session was live streamed.

Hey, does anyone know if the Stanford guys are going to be putting any reference material up for their “Watchers on the Wall” talk? Their work definitely got folks at my company excited about doing the kind of hash data collection they are doing, and we’d like to get a little more detail on what they did.

I agree.

CC: @sbl @jtavan

I certainly hope to make the code available. Currently going through getting approval to publish code publicly and will let you know when it’s up.

Ahead of the approvals, can you provide more details on the design and implementation? I tried to take some notes during your talk, but I didn’t get as much down as I would have liked. After the talk, my colleague and I compared notes and basically understood the following:

  • There was some kind of script or agent running on the endpoints that collected file data and calculated hash values. The collected information was stored locally in a SQLite database.
  • There is some intelligence in the script/agent to do “deltas” after the initial data collection. I wasn’t sure whether this is a once-daily operation or something more embedded that updates the SQLite data after any file-system create or modify event.
  • The WebUI Query pulls information directly from the endpoint’s SQLite database using Relevance that works specifically with SQLite databases.
  • The data collection you do is somewhat comparable to what is available in BFI, however you’re collecting data about many more file types, and you’re also collecting SHA1 values (BFI only does MD5 and SHA256).

Are these the basics of your system? Are there any other key points we might want to consider before heading down the road of attempting our own implementation?

I’m really sorry for the delay - approvals have been obtained and I’ll upload the code to a public-facing repo today and post a link. Your understanding is generally correct on all points. We only update daily, because we have no agent/driver that can update the SQLite on file writes, which would push this into the commercial EDR space.

BigFix deploys a Powershell script daily to subscribed endpoints. The script does the following:

  1. If a SQLite database file already exists, reads its hashes table into memory.
  2. Traverses the drive looking for “interesting” file types; for each file found, if it is new or its size or modification timestamp has changed since the last run (compared against the in-memory structure), hashes the file.
  3. Commits the current in-memory structure back to the SQLite table.
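
For illustration only, here is a rough sketch of what those three steps might look like in Powershell. This is not the actual production script; the database location, table layout, and file-type list are placeholders, and it assumes Powershell 4+ (for Get-FileHash) and that BigFix has dropped System.Data.SQLite.dll next to the script beforehand.

    # Rough sketch of the daily delta-hashing pass; paths, table, and column names are placeholders.
    Add-Type -Path "$PSScriptRoot\System.Data.SQLite.dll"   # staged by BigFix alongside the script

    $dbPath     = 'C:\ProgramData\FileHashes\hashes.sqlite'   # placeholder location
    $extensions = '.exe', '.dll', '.ps1'                      # placeholder "interesting" file types

    $conn = New-Object System.Data.SQLite.SQLiteConnection("Data Source=$dbPath")
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'CREATE TABLE IF NOT EXISTS hashes (path TEXT PRIMARY KEY, size INTEGER, mtime TEXT, sha1 TEXT)'
    [void]$cmd.ExecuteNonQuery()

    # Step 1: read any existing rows into memory, keyed by full path.
    $known = @{}
    $cmd.CommandText = 'SELECT path, size, mtime, sha1 FROM hashes'
    $reader = $cmd.ExecuteReader()
    while ($reader.Read()) {
        $known[[string]$reader['path']] = @{
            size  = [int64]$reader['size']
            mtime = [string]$reader['mtime']
            sha1  = [string]$reader['sha1']
        }
    }
    $reader.Close()

    # Step 2: walk the drive; re-hash only files that are new or whose size/mtime changed.
    Get-ChildItem -Path 'C:\' -Recurse -File -ErrorAction SilentlyContinue |
        Where-Object { $extensions -contains $_.Extension } |
        ForEach-Object {
            $prev  = $known[$_.FullName]
            $mtime = $_.LastWriteTimeUtc.ToString('o')
            if (-not $prev -or $prev.size -ne $_.Length -or $prev.mtime -ne $mtime) {
                $known[$_.FullName] = @{
                    size  = $_.Length
                    mtime = $mtime
                    sha1  = (Get-FileHash -Path $_.FullName -Algorithm SHA1).Hash
                }
            }
        }

    # Step 3: commit the current in-memory structure back to the SQLite table.
    foreach ($path in $known.Keys) {
        $cmd.CommandText = 'INSERT OR REPLACE INTO hashes (path, size, mtime, sha1) VALUES (@p, @s, @m, @h)'
        $cmd.Parameters.Clear()
        [void]$cmd.Parameters.AddWithValue('@p', $path)
        [void]$cmd.Parameters.AddWithValue('@s', [int64]$known[$path].size)
        [void]$cmd.Parameters.AddWithValue('@m', $known[$path].mtime)
        [void]$cmd.Parameters.AddWithValue('@h', $known[$path].sha1)
        [void]$cmd.ExecuteNonQuery()
    }
    $conn.Close()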

Once the SQLite files exist, BigFix analyses can query them easily with the sqlite inspector. We can equivalently do it ad-hoc via the Web UI.

Before implementing this yourself, you should ask yourself if daily hash updates are sufficient, or if you really need a continuously-updating EDR suite. This is a stopgap, a cheaper approach that doesn’t require the installation of another agent on the endpoints. Would I rather have something like Carbon Black? Sure. But that’s an additional big expense and another agent that may not be something you can deploy broadly.

Additionally, you should test extensively - I’ve pushed this for a week or so to a small test pool of machines and nobody noticed, but depending on your users and their usage patterns, the scanning I/O load may be impactful.

Thanks for the quick response here. Seems we got the gist of your presentation quite well; a testament to the compelling content and presentation style. 🙂

So for our purposes, I think a daily view is sufficient, and it lets us avoid the expense and hassle of deploying yet another agent-based application. As for testing with a small pilot group first, that is absolutely in our plans.

My main concern with the approach would be the end-user impact from the I/O. Besides the ongoing delta scans, we might also consider a slow ramp-up of the initial data collection, perhaps limiting each run to a fixed number of new/updated files. That would reduce the initial completeness of the data, but might be an acceptable “poor man’s” approach to throttling that reduces end-user impact. We’ll keep you updated if we do move forward with our own implementation.
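
For what it’s worth, that throttle could be as simple as a per-run counter. A hypothetical sketch (the extension list and budget are made-up values, not anything from the original script):

    # Hypothetical "poor man's" throttle: cap how many files get hashed per run so the
    # initial backfill spreads across several daily runs instead of one big I/O hit.
    $extensions      = '.exe', '.dll', '.ps1'   # placeholder "interesting" types
    $maxHashesPerRun = 500                      # placeholder daily budget; tune per environment
    $hashedThisRun   = 0

    Get-ChildItem -Path 'C:\' -Recurse -File -ErrorAction SilentlyContinue |
        Where-Object { $extensions -contains $_.Extension } |
        ForEach-Object {
            if ($hashedThisRun -ge $maxHashesPerRun) { return }  # leave the rest for the next run
            # (the new/changed check against the SQLite data would go here, as in the earlier sketch)
            $null = Get-FileHash -Path $_.FullName -Algorithm SHA1
            $hashedThisRun++
        }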

I’ve put the initial rough code into a public-facing repository at https://code.stanford.edu/secops-public/secops-posh/blob/master/Gather-InterestingFileHashes/Gather-InterestingFileHashes.ps1

Item to note: This code requires the System.Data.SQLite.dll and System.Data.SQLite.Linq.dll files from the SQLite distribution - we have BigFix drop those files in place along with the script prior to running. I have to run at the moment but I’ll add some documentation and the related BigFix actionscript when I get a chance.

I agree, the I/O is a primary concern.

One thing that could help a little, mostly with CPU, is to run the script with BigFix in “low priority mode”. See this outdated but still useful example: https://bigfix.me/fixlet/details/3967

You can do this with the following actionscript: action launch preference low-priority

I believe this is Windows only, and sadly it mainly helps with CPU, much less so with I/O.

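For anyone deploying this with BigFix, here is a prefetch line for the script in the repository above (prefetch downloads the file and verifies it against the stated size and hashes):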
prefetch Gather-InterestingFileHashes.ps1 sha1:d6a8c2e47444165bb9af9d7e57958953181563b6 size:8386 https://code.stanford.edu/secops-public/secops-posh/raw/00b552923457e69bf1376a8796364e53a0765961/Gather-InterestingFileHashes/Gather-InterestingFileHashes.ps1 sha256:87cd4adad572127a5cb61d1b17ed8f12440a486e201140274a69f63ca2dd9296

@jtavan @jgstew Thanks guys. Some great code here to look at and play with.

Implementing this kind of solution is in my “Important, not Urgent” bucket, so it’ll take me a little while to digest and possibly come back with a response on our ideas, implementation, etc. Generally, though, I like how this solution takes advantage of what you can do with BigFix and Powershell, giving IT ops/SecOps a lot of power to collect data from and monitor systems. Of course there are drawbacks and caveats to be aware of, but I like the model of collecting and storing data on the endpoint, to be gathered later by BigFix.

Thanks again for the usable code here and I’ll let you know if and when we progress on our own possible prototypes and solutions.

@jtavan I’ve got a little time today to play with the PS script you posted earlier. You mentioned that the code requires System.Data.SQLite.dll and System.Data.SQLite.Linq.dll. Do you have a typical distribution that you use, or do you tailor it to the endpoint? I’m looking here:

http://system.data.sqlite.org/index.html/doc/trunk/www/downloads.wiki

There are a ton of distributions there, and I’m not quite sure what the best, most “generic” choice would be.

There are a lot of options. I believe I ended up using the 32-bit version for .NET 3.5 SP1 as the most universally-applicable, but it’ll vary depending on what you have deployed in your environment. The 32-bit version works fine from a 32-bit Powershell session kicked off by BigFix, on either 32-bit or 64-bit systems.
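
If it helps while testing, a quick way to confirm which bitness the Powershell session kicked off by BigFix actually has (and therefore which System.Data.SQLite build it can load) is something like:

    # [IntPtr]::Size is 4 in a 32-bit process and 8 in a 64-bit one, so this works even
    # on older Powershell hosts where [Environment]::Is64BitProcess is unavailable.
    if ([IntPtr]::Size -eq 8) {
        Write-Output '64-bit Powershell session'
    } else {
        Write-Output '32-bit Powershell session'
    }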