ServiceNow Integration not Working

We have configured and installed the new BigFix Integration Service in order to make use of the BigFix-ServiceNow integration.

We have mapped the BigFix fields to the ServiceNow fields.
The verification process passed successfully, and we can see the process querying both the BigFix REST API and the ServiceNow API.

However, the log for the ServiceNow DataFlow shows an error when processing changes in the Target Adapter:

2020-07-02 11:36:14,579 - DataFlowRunner - ERROR - Discontinuing Execution of DataFlowRunner.

2020-07-02 11:36:14,579 - DataFlowRunner - ERROR - 1


After that the log only shows:

2020-07-02 11:39:02,071 - DataFlow - DEBUG - Processing Changes In Target Adapter


And nothing else seems to happen.
I have gone through the other logs, but I have not found anything that points me to the problem. The only thing I have found is that the pk is expected to be INT data, but in our environment the unique identifiers are STRING data.
Could that be the reason?

This may be related, yes…I will check internally. If you haven’t done so already, I’d also recommend opening a Support case on this item. One thing they will request is a copy of your configuration file (xml).

We have an open case with support. I have been able to set up the connection and send data to ServiceNow.
However, we are not able to sync data coming from ServiceNow. If we update one of the fields on the ServiceNow side, we do not see that data in BigFix.

One error I have noted is the following:

2020-07-07 11:39:54,315 - DataUpdater - DEBUG - Schema Mapping Complete
2020-07-07 11:39:54,315 - DataUpdater - DEBUG - Loading Mailboxes From Binary File
2020-07-07 11:39:54,316 - DataUpdater - DEBUG - File Not Found: Mailboxes-1215d37ed23a3b2e0a64ff1202dcbc07.dat
2020-07-07 11:39:54,316 - DataUpdaterTaskHandler - INFO - Processing Updates
2020-07-07 11:39:54,316 - DataUpdaterTaskHandler - INFO - No Pending Tasks Found.


Support has not made any comments on this.

We are having issues as well. The analysis that gets created in the ServiceNow CMDBAttributes site isn't relevant to any devices.

I have a case open that has been going for 15 days now. :frowning:

Something else we have noticed is that the integrationservices.exe processes do not stop when the service is stopped. They keep running until you kill them manually.

Still working on the issue. We were able to sync data once, but if there is a change on the ServiceNow side, BigFix won't receive those updates.

There is a known issue: the --Stop option does not terminate all of the processes. You need to kill them all manually.
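Until that is fixed, the cleanup can be scripted. A minimal Python sketch using the built-in Windows `taskkill` command; the process image name (`integrationservices.exe`) is taken from this thread, so verify it matches what Task Manager shows in your install:

```python
import subprocess

def kill_command(image_name: str) -> list[str]:
    """Build the Windows taskkill invocation that force-terminates (/F)
    every process whose image name (/IM) matches image_name."""
    return ["taskkill", "/F", "/IM", image_name]

def force_kill(image_name: str = "integrationservices.exe") -> int:
    """Run taskkill and return its exit code (0 on success,
    non-zero when no matching process was found)."""
    return subprocess.run(kill_command(image_name), check=False).returncode
```

Run it after issuing the --Stop (or stopping the Windows service) and before starting the service again, so no stale processes are left holding state.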

I’ve recently begun setting up the ServiceNow Data Flow in my environment.
Since you have it working in one direction, I wanted to check that the credentials you’re using are for a Master Operator account? The method it uses to post files to the computer mailbox sites requires a Master Operator.

I also think there could be something odd about the ‘weight’ values assigned, but I’m still working through it. It could be worth removing the ‘weight’ values on the ServiceNow-to-BigFix flow and seeing whether that makes a difference.

You may also try removing the cached .dat files that are generated in the integration directory, to force it to start over.
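If you end up doing that repeatedly, here is a minimal Python sketch of the cleanup; the install path below is an assumption for your environment, and you should stop the service (and kill any leftover integrationservices.exe processes) before running it:

```python
import glob
import os

# Path to the Integration Services install directory -- adjust for your
# environment (this location is an assumption, not a documented default).
INTEGRATION_DIR = r"C:\Program Files (x86)\BigFix Enterprise\Integration Services"

def clear_cached_state(directory: str) -> list[str]:
    """Delete the cached .dat state files so the next run starts fresh.

    Returns the list of paths that were removed; other files
    (logs, XML configuration) are left untouched.
    """
    removed = []
    for path in glob.glob(os.path.join(directory, "*.dat")):
        os.remove(path)
        removed.append(path)
    return removed

if __name__ == "__main__":
    for path in clear_cached_state(INTEGRATION_DIR):
        print("removed", path)
```

Restart the service afterwards and it will rebuild its cached state on the next execution interval.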

Can you post your XML (minus the server URLs and credentials)?

Yes, we are using MO creds, and we also have the service running as a service account with MO rights. Our XML is out of the box, without modifications except for the credentials and URLs, of course.

Yeah, I also cleared out the .dat and log files a few times. We have 4 analyses that have been created, but none of the mailbox files exist, so they are not relevant. Do the ServiceNow credentials need rights in BigFix? I assume not, but at this point we are grasping at straws…

You need to update the XML with the adjustments based on your environment.

After one month of trial and error, we have not been able to get the integration working successfully.
These are our observations/issues so far:

  • It’s not possible to use a custom CMDB ServiceNow table (according to support, this will be supported in the next release).

  • We can’t use any identifier for the systems other than the BigFix Computer ID.

  • The stop/start functionality of IntegrationService.exe doesn’t work properly: we need to kill the processes manually, and we have been told the wrapper can lose the encrypted credentials, so we need to set them again before restarting the service.

  • It is not clear how data persistence is ensured. We have been deleting the .dat files manually.

  • Mailboxes are not updated with data that has been updated in ServiceNow.

  • And the most annoying issue: the integration has created duplicate records in ServiceNow. Fortunately we are using sandbox environments with only about 10 devices; I can’t imagine this happening in a production environment.

We have an open case with HCL but that has not been enough to get this working.

We are starting to look at different options, and may just drop the adoption of the BigFix Integration Service from our project.

Could someone from HCL give us a real solution to this?

Private message sent, I’ll see where we can help.

We are having the exact same issues here with an ongoing open case. It’s very frustrating.
I really want this to work but I have to wonder what kind of QA was done prior to release. Good thing we have a dev environment for this and we didn’t put it in prod. :confused:

With Jason’s help, we are now at the point where the integration is working and stabilized.
We have resolved most of the issues I reported in this thread. I am still testing different scenarios and functionalities.

I can say it works as expected.

Thanks again for the help Jason.

Our issue was that the mailbox files never made it down to the endpoints, which broke the correlation and caused the duplicates. I modified it yesterday but the problem persisted at first… it seemed to remediate overnight.

Here are the bits of XML we added to get the data flow working:

<dataflow displayname="Endpoint data from BigFix To ServiceNow" minimumcorrelationconfidencelevel="55" executionintervalinminutes="360">

<property displayname="Computer Name" columnname="BigFix ServiceNow Data Flow-Hardware Attributes-2" datatype="string" weight="75"/>

<property displayname="Name" columnname="name" datatype="string" weight="75"/>

<dataflow displayname="Endpoint data from ServiceNow to BigFix" minimumcorrelationconfidencelevel="55" executionintervalinminutes="360">

Also of note: the integration creates duplicate entries in ServiceNow when CIs already exist. The fix pack is targeted for the 2nd half of 2021 at this time.
The issue can be tracked at the following URL:
https://support.hcltechsw.com/csm?id=kb_article&sysparm_article=KB0085217
