Best way to programmatically get Computer properties?

At my company we have a system that pulls multiple pieces of data from various tools so they can be consumed by others in a single place. Right now we have BigFix contributing many of the computer properties available via analysis results.

Currently the way we are implementing this data collection is through the BFI API, accessing the computers report. This method generally works well: it’s straightforward, it doesn’t require interactive logins (we’re using an API token attached to a service account), and it’s quick to update.
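For reference, the pull looks roughly like this. This is a minimal sketch: the endpoint path, column names, and paging parameters reflect our setup and may differ by BFI version, so verify them against your BFI REST API documentation; the host and token are placeholders.

```python
import requests

# BigFix Inventory server and the API token from the service account's profile
# page (https://BFIserver:port/account). Both values here are placeholders.
BFI_URL = "https://bfi.example.com:9081"
API_TOKEN = "xxxxxxxxxxxxxxxx"


def get_computers(limit=1000):
    """Pull the computers report page by page via the BFI REST API."""
    computers = []
    offset = 0
    while True:
        resp = requests.get(
            f"{BFI_URL}/api/sam/v2/computers",
            params={
                "token": API_TOKEN,                  # token auth, no interactive login
                "columns[]": ["id", "name", "os"],   # assumed column names
                "limit": limit,
                "offset": offset,
            },
            verify=False,  # or point this at your CA bundle
        )
        resp.raise_for_status()
        page = resp.json()
        # Depending on BFI version the payload may be a bare list or wrapped
        # in an object with a "rows" key; handle both.
        rows = page.get("rows", []) if isinstance(page, dict) else page
        computers.extend(rows)
        if len(rows) < limit:
            break
        offset += limit
    return computers


if __name__ == "__main__":
    for computer in get_computers():
        print(computer)
```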

Now, as the data in BigFix becomes more popular, we’re being asked to import more and more computer properties from BigFix so they’re available via the BFI API. This is manageable, but it may get out of hand. We’re already at what I’d consider the upper limit of the number of properties we should be importing into BFI.

So, I’m considering making the BigFix REST API available for data consumption, but we have concerns about the security model as well as the API potentially opening us up to performance issues.
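For context, the kind of call we’d be exposing is the /api/query endpoint with a session relevance expression, along the lines of the sketch below (host, port, and credentials are placeholders, and the relevance expression is just an example):

```python
import requests
import xml.etree.ElementTree as ET

# BigFix root server REST API (placeholder host and credentials; default port 52311)
BES_URL = "https://bigfix.example.com:52311"
AUTH = ("rest_api_account", "password")

# Session relevance: one tuple of (name, OS, last report time) per computer
RELEVANCE = "(name of it, operating system of it, last report time of it) of bes computers"

resp = requests.get(
    f"{BES_URL}/api/query",
    params={"relevance": RELEVANCE},
    auth=AUTH,
    verify=False,  # or your CA bundle
)
resp.raise_for_status()

# Results come back as XML; each result tuple is a <Tuple> of <Answer> elements.
root = ET.fromstring(resp.content)
for tup in root.iter("Tuple"):
    print([answer.text for answer in tup.iter("Answer")])
```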

What would be the best way to programmatically get computer properties out of BigFix? Additionally, are there good code examples (Python, Java, etc) that are easy to understand and emulate?


Definitely second this.

We’re in a similar situation where we have individuals pulling data via the SOAP API, but we’d prefer to have the data centralized in Splunk to keep the requests to our server infrastructure as light as possible.

Documentation and examples of API calls would definitely be appreciated.

@jimmym
How are you guys doing token-based authentication? This is one thing that has bugged me about the SOAP API calls we’re making; if you’ve got any tips you can share on this, it’d be greatly appreciated.

@mwolff In BigFix Inventory, if you look at your user profile (https://BFIserver:port/account), you’ll see a field labeled “API Token” and you can show that token. That token can be used in your BFI API calls and forgoes the need to log in to BFI.
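On the BFI side the token just rides along on the request as a query parameter, e.g. (host, port, and token below are placeholders, and the endpoint may vary by BFI version):

```python
import requests

# Quick sanity check that a BFI API token works: pull a single row from the
# computers report. Host, port, and token are placeholders.
resp = requests.get(
    "https://bfi.example.com:9081/api/sam/v2/computers",
    params={"token": "xxxxxxxxxxxxxxxx", "limit": 1},
    verify=False,
)
print(resp.status_code)
print(resp.json())
```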

On the point of methods for exposing BigFix data, I was talking with a colleague and initially we thought we might want to generate a nightly replica of the BigFix database to use as a data source isolated from the production system.

Then we realized we’re already doing database replication through DSA. So the database connected to the disaster recovery server might be a data source option we already have available. We can use the Database API (https://developer.bigfix.com/other/database-api/) to extract the data we need.
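To make that concrete, the extraction might look something like the sketch below against the DR replica. The connection details are placeholders, and the table and column names are illustrative only rather than the real BFEnterprise schema; the actual names would come from the Database API documentation linked above.

```python
import pyodbc

# Connect to the DSA/DR replica of the BFEnterprise database (read-only).
# Server, database, and credentials are placeholders for this sketch.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=bigfix-dr.example.com;"
    "DATABASE=BFEnterprise;"
    "UID=readonly_svc;PWD=secret;"
    "ApplicationIntent=ReadOnly;"
)

# Illustrative query only: the real table and column names should be taken from
# the Database API documentation (https://developer.bigfix.com/other/database-api/).
SQL = """
SELECT c.ComputerID, p.PropertyName, r.ResultValue
FROM   COMPUTERS c
JOIN   PROPERTY_RESULTS r ON r.ComputerID = c.ComputerID
JOIN   PROPERTIES p       ON p.PropertyID = r.PropertyID
"""

with conn:
    cursor = conn.cursor()
    for computer_id, prop_name, value in cursor.execute(SQL):
        print(computer_id, prop_name, value)
```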

We want to explore this more, but I’m interested to hear what others think. Does this seem like a viable option to folks? What caveats would there be?
