SCA Import Issue

SCA Import Failed
IBM BigFix Compliance

SCA Version: 1.7.55
IEM Server: 9.2.6
SQL Server 2012 installed on the same server

2016-01-14 10:20:29 (+0:00:51.685) ERROR: Duplicate Sentinel Fixlet(s) detected!
The fixlets; 'Applicability - Microsoft Windows Server 2008 R2', 'Applicability - Microsoft Windows Server 2008 R2', and 'Applicability - Microsoft Windows Server 2008 R2' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows Server 2008', 'Applicability - Microsoft Windows Server 2008', 'Applicability - Microsoft Windows Server 2008', and 'Applicability - Microsoft Windows Server 2008' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows Server 2012' and 'Applicability - Microsoft Windows Server 2012' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows with Oracle 11g R2', 'Applicability - Microsoft Windows with Oracle 11g R2', and 'Applicability - Microsoft Windows with Oracle 11g R2' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability Fixlet - RedHat Linux 6' and 'Applicability Fixlet - RedHat Linux 6' in the site 'Techsa Services' contain the same Sentinel ID.
2016-01-14 10:20:29 (+0:00:00.002) INFO: ETL Datasource task: from Data Source - SCM::Sentinel                      (0x0000000000000000 - 0x0000000002DFC065): Failed
2016-01-14 10:27:37 (+0:07:08.270) ERROR: Sequel::DatabaseError: Java::ComMicrosoftSqlserverJdbc::SQLServerException: Cannot insert duplicate key row in object 'scm.sentinels' with unique index

Hello,

We used to get this error message in SCA versions 1.4 and 1.5. A fix for the duplicate sentinel fixlets was released in SCA 1.6.
I would suggest opening a PMR; you shouldn’t be getting this error in version 1.7.

It is possible that you actually do have duplicate sentinels. Please inspect the “Techsa Services” site and make sure everything looks correct.

1.7 is actually less aggressive in sentinel checks than 1.6.

Done, thanks.

We did the same: we removed the “Techsa Services” site and it is working fine.

Hi All,

After upgrading SCA from 1.8 to 1.9.70, we now have an import issue.

The import fails with “Error: Duplicate checks fixlet(s) detected!” and it flags all of our custom checks as duplicates. TIA

Please ensure that each of your custom checks has a unique scm-id MIME field. SCA 1.9 will enforce this uniqueness.

If the site looks OK, then it could be an issue with a desync between SCA stored data and the incoming BES data. In this case you could attempt the Server Settings -> Remediate import option to see if that resolves the issue.

OK, when I looked at it again I noticed the duplicate-checking logic is flawed. You should ignore everything it says about duplicates except the message that actually errors out the import (the one that starts with something like ERROR: Sequel::DatabaseError: Java::ComMicrosoftSqlserverJdbc::SQLServerException).

You can’t really tell how many actual duplicates you have, since the import will always fail on the first duplicate; most likely there are not that many, however.


I have both SQL and Session Relevance queries that can determine the duplicate x-fixlet-scm-id values and the fixlets involved. I’m pretty sure they’re already posted in this forum; I’ll try to find them.

Edit: Duplicate x-fixlet-scm-id when installing Compliance 1.9

Hi Jason,

I followed this link before the upgrade: http://www-01.ibm.com/support/docview.wss?uid=swg219958161, and I already deleted the duplicate SCM checks. After the upgrade I clicked Update Schema and ran a data import, which is now failing. When I run the same database query, I get the result below.

Hi Karlhe,

The Remediate import option also failed. Thanks

Yeah, sorry. Please look at my second comment; I revised my suggestion.

If you need further clarification, let me know.

Hi Karlhe,

Below is the line where the data import encounters the error. Maybe you could suggest how to resolve it. Thanks

2017-04-21 17:22:13 (+0:00:00.000) INFO: ETL Datasource task: from Datasource - SCM::CheckFixlet (0x0000000000000000 - 0x00000001609628F7): Failed
2017-04-21 17:22:13 (+0:00:00.016) ERROR: Sequel::DatabaseError: Java::ComMicrosoftSqlserverJdbc::SQLServerException: Cannot insert duplicate key row in object ‘scm.check_fixlets_persistent’ with unique index ‘scm_check_fixlets_persistent_check_id_scm_id_index’. The duplicate key value is (82032, PCI-Win2008[183651784]).
com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(com/microsoft/sqlserver/jdbc/SQLServerException.java:216)
com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:1515)
com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:792)
com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:689)
com.microsoft.sqlserver.jdbc.TDSCommand.execute(com/microsoft/sqlserver/jdbc/IOBuffer.java:5696)
com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1715)
com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:180)
com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:155)
com.microsoft.sqlserver.jdbc.SQLServerStatement.execute(com/microsoft/sqlserver/jdbc/SQLServerStatement.java:662)
java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:508)

So from the error we have:

check_id = 82032
scm_id = PCI-Win2008[183651784]

Although the duplicate-check messages above that line are printed without a stated reason, you can still use them to help debug: look for the matching scm_id (in this case PCI-Win2008[183651784]) to at least see which sites it exists in.

To get the fixlet name you could try querying the SCA database to see if anything turns up, e.g.

SELECT * FROM scm.checks c WHERE c.id = 82032

You could also try this:

SELECT * FROM scm.check_fixlets cf JOIN datasource_fixlets f ON f.id = cf.datasource_fixlet_id JOIN scm.checks c ON c.id = cf.check_id WHERE cf.scm_id = 'PCI-Win2008[183651784]'

Both queries will only work if some related data was imported previously.
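If you’d rather see all duplicated scm_id values at once instead of finding them one failed import at a time, a GROUP BY over the same table should work. This is a sketch built from the table and column names in the queries above (scm.check_fixlets, scm_id); like those queries, it only reflects data that was imported previously:

```sql
-- List every scm_id that appears more than once, with a count.
-- Assumes the scm.check_fixlets table used in the queries above.
SELECT cf.scm_id, COUNT(*) AS occurrences
FROM scm.check_fixlets cf
GROUP BY cf.scm_id
HAVING COUNT(*) > 1
ORDER BY occurrences DESC;
```

Anything this returns is a candidate to hit the unique index on scm.check_fixlets_persistent during the import.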

You could also try using the scm-id to find it in the console, but since it isn’t visible in the console UI, you’d need to export the fixlets to check. If you managed to get the site name from the queries above, this shouldn’t be too bad: you can export the entire site as a .bes file and then string-search it for multiple matches.

Um, did you check the link I posted?

((item 0 of it, name of item 1 of it, (id of item 0 of it, name of item 0 of it) of (fixlets of item 1 of it, item 0 of it) whose (mime field "x-fixlet-scm-id" of item 0 of it = item 1 of it)) of (unique values of mime fields "x-fixlet-scm-id" of fixlets of it, it) whose (multiplicity of item 0 of it != 1)) of bes custom sites

This Session Relevance should work in Web Reports, the BigFix Console Presentation Debugger, or via the API. It does not require that you have imported anything into Compliance yet. Is this not working for you? It should return the x-fixlet-scm-id value, site name, fixlet ID, and fixlet name for every duplicated x-fixlet-scm-id value in every custom site.

Once those are known, you still have to export them from the console, edit the x-fixlet-scm-ids in the exported XML, and re-import; or, more likely, you’ll find you don’t need the duplicates anyway and can just delete them. (In my case, an operator had imported an entire checklist twice into the same custom site, so every fixlet was duplicated; I was able to sort them by modification time and delete the older versions.)
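For reference, the field you would be editing in the exported .bes XML is a MIME field on each Fixlet. A sketch of what it looks like, with the surrounding elements abbreviated (the value shown is the one from the error earlier in this thread):

```xml
<!-- Inside a <Fixlet> element of the exported .bes file (abbreviated) -->
<MIMEField>
	<Name>x-fixlet-scm-id</Name>
	<Value>PCI-Win2008[183651784]</Value>
</MIMEField>
<!-- Give each duplicate a distinct <Value> before re-importing. -->
```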

Hi,

We identified the 3 duplicate checks and deleted them from the console, but the data import still fails with the same error. Thanks