We have configured and installed the new BigFix Integration Service in order to use the BigFix-ServiceNow Integration.
We have mapped the BigFix fields to the ServiceNow fields.
The verification process passed successfully, and in fact we can see the process querying both the BigFix REST API and the ServiceNow API.
However, the ServiceNow DataFlow log shows an error when processing Changes in the Target Adapter:
And nothing else seems to happen.
I have gone through the other logs, but I have not found anything pointing me to the problem. The only thing I have found is that the pk is meant to be INT data, but in our environment the unique identifiers are STRING data.
Could that be the reason?
This may be related, yes… I will check internally. If you haven't done so already, I'd also recommend opening a Support case on this item. One thing they will request is a copy of your configuration file (XML).
We have an open case with Support. I have been able to set up the connection and send data to ServiceNow.
However, we are not able to sync data coming from ServiceNow. If we update one of the fields on the ServiceNow side, we do not see the change in BigFix.
Something else we have noticed is that the integrationservices.exe processes do not stop when the service is stopped; they keep running until you kill them manually.
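As a workaround until the service stops cleanly, the stray processes can be terminated by hand. A minimal sketch, with the service name assumed and shown as a POSIX shell one-liner for illustration (on the Windows host itself the equivalent is `taskkill /IM integrationservices.exe /F`):

```shell
# Stop the Windows service first (service name is an assumption; check services.msc):
#   net stop "BigFix Integration Service"

# Then terminate any integrationservices.exe processes that survived the stop.
# `|| true` keeps the line from failing when no stray process exists.
pkill -f 'integrationservices.exe' || true
```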
I've recently begun setting up the ServiceNow Data Flow in my environment.
Since you have it working in one direction, I wanted to check: is the account whose credentials you're using a Master Operator account? The method it uses to post files to the computer mailbox sites requires a Master Operator.
I also think there could be something odd about the "weight" values assigned, but I'm still working through it. It could be worth removing the "weight" values on the ServiceNow-to-BigFix flow and seeing whether that makes a difference.
You may also try removing the cached .dat files that are generated in the integration directory, to force it to start over.
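As a rough sketch of that reset, assuming the cache files sit in the integration install directory and end in .dat (the directory and service name are placeholders, and this is written as a POSIX shell for illustration; on the Windows host you would move the files aside with Explorer or `move`):

```shell
# Stop the service before touching the cache (service name is an assumption):
#   net stop "BigFix Integration Service"

# INTEGRATION_DIR is a placeholder; point it at your actual install directory.
INTEGRATION_DIR="${INTEGRATION_DIR:-.}"

# Move the cached .dat files into a timestamped backup folder instead of
# deleting them outright, so the previous state can be restored if needed.
BACKUP_DIR="$INTEGRATION_DIR/dat-backup-$(date +%Y%m%d%H%M%S)"
mkdir -p "$BACKUP_DIR"
find "$INTEGRATION_DIR" -maxdepth 1 -name '*.dat' -exec mv {} "$BACKUP_DIR"/ \;

# Restart the service afterwards; it should rebuild its state from scratch:
#   net start "BigFix Integration Service"
```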
Can you post your XML (minus the server URLs and credentials)?
Yes, we are using MO credentials, and the service also runs as a service account with MO rights. Our XML is out of the box, with no modifications except for the credentials and URLs, of course.
Yeah, I also cleared out the .dat and log files a few times. We have 4 analyses that have been created, but none of the mailbox files exist, so they are not relevant. Do the ServiceNow credentials need rights in BigFix? I assume not, but at this point we are grasping at straws…
After one month of trial and error, we have not been able to get the Integration working successfully.
These are our observations/issues so far:
It's not possible to use a custom ServiceNow CMDB table (according to Support, this will be supported in the next release).
We can't use a different identifier for the systems other than the BigFix Computer ID.
The stop/start functionality of IntegrationService.exe doesn't work properly; we need to kill the processes manually, and we have been told that the wrapper could lose the encrypted credentials, so we need to set them again before restarting the service.
It is not clear how data persistence is ensured. We have been deleting the .dat files manually.
Mailboxes are not updated with data that has been updated in ServiceNow.
And the most annoying issue: the integration has created duplicate records in ServiceNow. Fortunately we are using sandbox environments with only about 10 devices; I can't imagine this happening in a production environment.
We have an open case with HCL but that has not been enough to get this working.
We are starting to look at different options, and may simply drop the BigFix Integration Service from our project.
Could someone from HCL give us a real solution to this?
We are having the exact same issues here with an ongoing open case. It's very frustrating.
I really want this to work, but I have to wonder what kind of QA was done prior to release. Good thing we have a dev environment for this and we didn't put it in prod.
With Jason's help, we are now at the point where the Integration is working and has stabilized.
We have resolved most of the issues I reported in this thread. I am still testing different scenarios and functionalities.
Our issue was that the mailbox files never made it down to the endpoints, which broke the correlation and caused the duplicates. I modified it yesterday but the problem persisted at first… it seemed to remediate overnight.
Here are the bits of XML we added to get the data flow working:
<dataflow displayname="Endpoint data from BigFix To ServiceNow" minimumcorrelationconfidencelevel="55" executionintervalinminutes="360">
<property displayname="Computer Name" columnname="BigFix ServiceNow Data Flow-Hardware Attributes-2" datatype="string" weight="75"/>