Business Intelligence (BI) products & BigFix

Curious if anyone here is using Business Intelligence (BI) products (Tableau, Crystal Reports, Yellowfin, etc.) to directly or indirectly (via CSV or another file format) access BigFix data for computer inventory analysis.

Or is anyone aware of a forum related to the use of Business Intelligence (BI) in conjunction with BigFix data?


Hi,

I’ve been doing some basic work on putting BigFix Data into ElasticSearch and visualizing data with Kibana. Is there a specific question you have?

Bill


I was hoping that BigFix / IBM supported third-party applications such as Tableau or Crystal Reports to access the data in the same way that IBM Web Reports does. Of course, the advantage of using these BI tools is their advanced reporting functions (grouping, sorting, conditional formatting, etc.) that IBM Web Reports does not have.
However, in reading through the article in the link below, it does not sound like this is recommended.
https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/Tivoli%20Endpoint%20Manager/page/Querying%20BigFix%20Data

From the article:
Notes about querying the BigFix database:
The BigFix database is highly optimized for speed of insertion and it is not optimized for queries. BigFix only recommends querying the database Views. The underlying database tables are subject to change, but the Views are designed to stay static even if the underlying table structure changes.
Since the database tables are very simple and optimized for insertions, there are no foreign key constraints and no ERD diagrams for the BigFix database. Some of the data in the database is stored in XML and you will need to use stored procedures such as fn_ExtractField to access the data within the XML. You should consider using the SOAP API instead of database queries to avoid this problem.
Querying the BigFix database can result in table locking and can slow the whole BigFix system down! You should consider using the REST API or the SOAP API instead of database queries to avoid this problem.
BigFix recommends that you consider using the REST API or the SOAP API instead of database queries to allow easier-to-use queries that are faster and do not affect database performance.
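The REST API route those notes recommend can be sketched in Python. The `/api/query` endpoint takes a URL-encoded session relevance expression; the hostname below is a placeholder, and 52311 is the default BigFix root server port.

```python
# Sketch of querying BigFix through the REST API instead of direct SQL.
# The /api/query endpoint takes a session relevance expression as a
# URL-encoded "relevance" parameter; the hostname below is a placeholder.
import urllib.parse

BES_SERVER = "https://bigfix.example.com:52311"  # placeholder root server

def build_query_url(relevance: str) -> str:
    return f"{BES_SERVER}/api/query?relevance={urllib.parse.quote(relevance)}"

url = build_query_url("names of bes computers")
# Executing the query (returns XML; needs console credentials) would look like:
# req = urllib.request.Request(url)
# req.add_header("Authorization", "Basic " + base64_encoded_credentials)
# xml = urllib.request.urlopen(req).read()
```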

Therefore, my assumption is that I am left with creating selection criteria via the “IBM Report Writer”, exporting the data to CSV file format, and then importing the CSV data into a database table so that Tableau or Crystal Reports can access this information and generate reports. That is a much more complex method than just linking to the data in the same manner that IBM Web Reports does.
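That CSV round-trip can be sketched like this; the column names and the use of SQLite are illustrative assumptions (a real setup would load into whatever database the BI tool reads from):

```python
# Minimal sketch of the CSV round-trip: take a Web Reports CSV export
# (hypothetical columns "Computer Name" and "OS") and stage it in a SQL
# table that Tableau or Crystal Reports could then point at.
import csv
import io
import sqlite3

csv_text = "Computer Name,OS\nhost01,Win10 10.0.19045\nhost02,Win7 6.1.7601\n"

conn = sqlite3.connect(":memory:")  # a real setup would use SQL Server, etc.
conn.execute("CREATE TABLE inventory (computer_name TEXT, os TEXT)")
rows = [(r["Computer Name"], r["OS"]) for r in csv.DictReader(io.StringIO(csv_text))]
conn.executemany("INSERT INTO inventory VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM inventory").fetchone()[0]
```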

Am I correct in my assumption?


I have not seen anything referred to as “IBM Report Writer” – can you help me understand what you are referring to with that?

In my experience, your best bet would be to do an ETL using the REST API and load the data daily (or at some other frequency) into your other system.

Maybe someone else can chime in with what they’ve done, but for the stuff I do, I just get a list of computer IDs and run this once for each computer in the environment:

(name of property of it, values of it) of results from (bes properties) of bes computers whose (id of it = "$ComputerID")

I then take the resulting data, convert it into JSON (or some other format), and insert it directly into the target datastore.

You should be able to do it pretty easily in one go with PowerShell – grab the info from the REST API, do any parsing, and insert it directly into SQL. I’m not sure the CSV step is necessary.
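A sketch of that parsing step, assuming the query above returns (property name, value) pairs per computer (Python stands in for PowerShell here, and the field names are illustrative):

```python
# Shape the (property name, value) pairs returned by the session relevance
# query into one JSON document per computer, ready for the target datastore.
# A property can have multiple results, so values are collected into lists.
import json
from collections import defaultdict

def rows_to_document(computer_id: int, rows):
    props = defaultdict(list)
    for name, value in rows:
        props[name].append(value)
    return {"computer_id": computer_id, "properties": dict(props)}

doc = rows_to_document(12345, [
    ("OS", "Win10 10.0.19045"),
    ("IP Address", "10.0.0.5"),
    ("IP Address", "192.168.1.5"),  # multi-valued property
])
payload = json.dumps(doc)  # ready to insert into the target datastore
```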


“I have not seen anything referred to as ‘IBM Report Writer’ – can you help me understand what you are referring to with that?” *** Should have read “IBM Web Reports” ***


We’re doing basically the same, and importing the results into ServiceNow.

By “doing the same thing”… do you mean using PowerShell together with the REST API?

We are about to do this, but are still deciding which vendor will be the easiest to integrate with BigFix and the other products we have.

Getting a good API integration and ETL process is critical.


In my case, we have a Java developer using the REST API to retrieve inventory and hardware asset information from BigFix Web Reports, and importing that data into ServiceNow.


Hey, quick question: did you make any headway with the Kibana and BigFix data integration?

Hi,

The company I work for sells a product based on this, so I can give you general hints, but I can’t share the solution as it’s a product now 🙂

If you understand the BigFix REST API and how Elasticsearch documents work you should be able to work through the items required.
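As one general hint: Elasticsearch ingests newline-delimited JSON through its `_bulk` API (an action line followed by a document line). A minimal sketch of shaping records for it, where the index name and fields are assumptions rather than anything from the product:

```python
# Shape documents into the NDJSON body expected by Elasticsearch's _bulk API:
# one action line ({"index": ...}) followed by one document line per record.
import json

def to_bulk_lines(index: str, docs):
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the bulk body must end with a newline

body = to_bulk_lines("bigfix-computers", [{"name": "host01", "os": "Win10"}])
# POST body to https://<elastic-host>:9200/_bulk
# with Content-Type: application/x-ndjson
```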

Thank you, sir. Since the purpose was for a report and I don’t really need to see the data on the console, I used the Excel connector to pull the report, and it concatenates all “multiple” results into legible columns.

Thank you for the speedy response.

Thank you, but we are not allowed to use the Excel add-on to accommodate this… we need to write it in session relevance.
:)…

I forgot we had a thread about BI here.

Since May we’ve bought a BI tool that does ETL and visualization, and we are moving into production, extracting data from BigFix and other data sources to provide BI-style reporting.

The warning from IBM about direct queries is concerning, but so far we haven’t seen any issues. We are running at near real time (importing the data every few hours). The key part is getting the data from the views; almost all of the information you’ll need is in the views.

However, there is a caveat (I have brought this up to IBM multiple times in multiple places; so far, no changes): the data from the fixlet properties is BAD. It is not standardized or normalized. You can see this by looking at the CVEs: the separators change between sites, and sometimes there are extra or missing spaces.

To get around this, I wrote a Python script that grabs the CSV generated by Web Reports (with site, fixlet, and CVE) and normalizes the CVE column. I then import that back into the BI tool, and I have a clean fixlet-to-CVE table to use.
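The normalization step can be sketched like this (the column names are illustrative; the idea is a regex that pulls each CVE ID out regardless of separator or spacing, producing one fixlet-to-CVE row per ID):

```python
# Normalize the CVE column of a Web Reports CSV export: the separators vary
# between sites ("CVE-2021-1111 ;CVE-2021-2222", commas, extra spaces...),
# so extract each CVE identifier with a regex and emit one row per CVE.
import csv
import io
import re

CVE_RE = re.compile(r"CVE-\d{4}-\d{4,}")

def normalize(csv_text: str):
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        for cve in CVE_RE.findall(row["CVEs"]):
            rows.append((row["Site"], row["Fixlet"], cve))
    return rows

sample = "Site,Fixlet,CVEs\nPatches for Windows,MS21-001,CVE-2021-1111 ;CVE-2021-2222\n"
clean = normalize(sample)  # one (site, fixlet, CVE) row per identifier
```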

Feel free to reach out and I can share what I’ve done in private comms.

Chris

Hi Chris,

Yes please, can we take this private? I would like to know more about the solution.

Hi Chris, can you share what you have done? I believe you have the solution to the issue I am having with automating some of the process of getting data to Tableau.

Thank you!