SCA Version: 1.7.55
IEM Server 9.2.6
SQL 2012 installed on the same server
2016-01-14 10:20:29 (+0:00:51.685) ERROR: Duplicate Sentinel Fixlet(s) detected!
The fixlets; 'Applicability - Microsoft Windows Server 2008 R2', 'Applicability - Microsoft Windows Server 2008 R2', and 'Applicability - Microsoft Windows Server 2008 R2' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows Server 2008', 'Applicability - Microsoft Windows Server 2008', 'Applicability - Microsoft Windows Server 2008', and 'Applicability - Microsoft Windows Server 2008' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows Server 2012' and 'Applicability - Microsoft Windows Server 2012' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability - Microsoft Windows with Oracle 11g R2', 'Applicability - Microsoft Windows with Oracle 11g R2', and 'Applicability - Microsoft Windows with Oracle 11g R2' in the site 'Techsa Services' contain the same Sentinel ID.
The fixlets; 'Applicability Fixlet - RedHat Linux 6' and 'Applicability Fixlet - RedHat Linux 6' in the site 'Techsa Services' contain the same Sentinel ID.
2016-01-14 10:20:29 (+0:00:00.002) INFO: ETL Datasource task: from Data Source - SCM::Sentinel (0x0000000000000000 - 0x0000000002DFC065): Failed
2016-01-14 10:27:37 (+0:07:08.270) ERROR: Sequel::DatabaseError: Java::ComMicrosoftSqlserverJdbc::SQLServerException: Cannot insert duplicate key row in object 'scm.sentinels' with unique index
We used to get this error message in SCA versions 1.4 and 1.5. A fix to correct the duplicate sentinel fixlets was released in SCA version 1.6.
I would suggest you open a PMR. You shouldn't be getting this error in version 1.7.
Please ensure that each of your custom checks has a unique scm-id MIME field. SCA 1.9 will enforce this uniqueness.
If the site looks OK, then it could be an issue with a desync between SCA stored data and the incoming BES data. In this case you could attempt the Server Settings -> Remediate import option to see if that resolves the issue.
OK, when I looked at it again I noticed the duplicate-checking logic is flawed. So you should ignore everything it says about duplicates, other than the one that actually errors out the import (meaning the one that starts with something like ERROR: Sequel::DatabaseError: Java::ComMicrosoftSqlserverJdbc::SQLServerException).
You can't really tell how many actual duplicates you have, since the import will always error on the first duplicate; most likely there aren't that many, however.
I have both SQL and Session Relevance queries that can determine the duplicate x-fixlet-scm-id values and the fixlets involved. I'm pretty sure they're already posted in this forum; I'll try to find them.
I followed this link before the upgrade: http://www-01.ibm.com/support/docview.wss?uid=swg219958161 and had already deleted the duplicate SCM checks. After the upgrade I clicked Update Schema and ran a data import, which is now failing. When I run the same database query I get the result below.
Although the duplicate-check messages above the line are printed without cause, you can still use them to help debug: look for the matching scm_id (in this case PCI-Win2008[183651784]) to at least see which sites it exists in.
To get the fixlet name you could try querying the SCA database to see if anything turns up, e.g.
SELECT * FROM scm.checks c WHERE c.id = 82032
You could also try this:
SELECT *
FROM scm.check_fixlets cf
JOIN datasource_fixlets f ON f.id = cf.datasource_fixlet_id
JOIN scm.checks c ON c.id = cf.check_id
WHERE cf.scm_id = 'PCI-Win2008[183651784]'
Both queries would only work if some related data was imported previously.
You could also try using the scm-id to find it in the console, but since this isn't visible in the console UI you'd need to export the fixlets to be able to check. If you managed to get the site name from above, this shouldn't be too bad, since you can export the entire site as a .bes file and then string-search it to look for multiple matches.
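The string search over an exported .bes file can be sketched in Python. The XML excerpt and field layout below are illustrative assumptions (a real export carries the scm-id in a MIME field named x-fixlet-scm-id, but with considerably more surrounding structure):

```python
import re
from collections import Counter

# Invented excerpt of an exported custom site (.bes XML) for illustration;
# real exports have more structure around each MIMEField element.
sample_bes = """
<BES>
  <Fixlet><Title>Check A</Title>
    <MIMEField><Name>x-fixlet-scm-id</Name><Value>PCI-Win2008[183651784]</Value></MIMEField>
  </Fixlet>
  <Fixlet><Title>Check B</Title>
    <MIMEField><Name>x-fixlet-scm-id</Name><Value>PCI-Win2008[183651784]</Value></MIMEField>
  </Fixlet>
  <Fixlet><Title>Check C</Title>
    <MIMEField><Name>x-fixlet-scm-id</Name><Value>PCI-Win2008[999999999]</Value></MIMEField>
  </Fixlet>
</BES>
"""

def duplicate_scm_ids(bes_xml: str) -> dict:
    """Return {scm_id: count} for scm-ids appearing more than once."""
    ids = re.findall(
        r"<Name>x-fixlet-scm-id</Name>\s*<Value>([^<]+)</Value>", bes_xml
    )
    return {scm_id: n for scm_id, n in Counter(ids).items() if n > 1}

print(duplicate_scm_ids(sample_bes))
# prints {'PCI-Win2008[183651784]': 2}
```

A plain regex is enough here because we only need to count occurrences, not edit the XML; for rewriting the file you would want a proper XML parser.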
((item 0 of it, name of item 1 of it, (id of item 0 of it, name of item 0 of it) of (fixlets of item 1 of it, item 0 of it) whose (mime field "x-fixlet-scm-id" of item 0 of it = item 1 of it)) of (unique values of mime fields "x-fixlet-scm-id" of fixlets of it, it) whose (multiplicity of item 0 of it != 1)) of bes custom sites
This Session Relevance should work in Web Reports, the BigFix Console Presentation Debugger, or via the API. It does not require that you've imported anything into Compliance yet. Is this not working for you? It should return the values for x-fixlet-scm-id, Site Name, Fixlet ID, and Fixlet Names for all of the duplicated x-fixlet-scm-id values in every Custom Site.
Once those are known, you do still have to export them from the console, edit the x-fixlet-scm-ids in the exported XML, and re-import; or, more likely, you'll find you don't need the duplicates anyway and can just delete them. (In my case, an operator had imported an entire checklist twice into the same custom site, and every fixlet was duplicated; I was able to just sort them by modification time and delete the older versions.)
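The clean-up just described (keep the most recently modified copy per duplicated scm-id, delete the rest) can be sketched as follows; the fixlet IDs and modification dates are invented sample data:

```python
from datetime import date

# Invented sample data: (fixlet_id, scm_id, last_modified).
fixlets = [
    (82032, "PCI-Win2008[183651784]", date(2015, 6, 1)),
    (91544, "PCI-Win2008[183651784]", date(2015, 9, 3)),
    (70011, "PCI-Win2008[999999999]", date(2015, 4, 2)),
]

# Group fixlets by scm-id.
groups = {}
for fixlet_id, scm_id, modified in fixlets:
    groups.setdefault(scm_id, []).append((modified, fixlet_id))

# For each duplicated scm-id, keep the most recently modified fixlet
# and mark the older copies for deletion.
to_delete = []
for scm_id, entries in groups.items():
    if len(entries) > 1:
        entries.sort()  # oldest first
        to_delete.extend(fixlet_id for _, fixlet_id in entries[:-1])

print(to_delete)
# prints [82032]
```

This is only a decision sketch; the actual deletion would still be done manually in the console (or via the REST API) against the fixlet IDs it reports.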