QRadar not sending information to BigFix

I’ve followed this document:

and configured the QRadar adapter to send data to BigFix; however, on the BigFix side no data is received.
Is anyone able to advise?
2016-05-20 11:02:50,569 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Processing raw data from file /store/qvm/adaptor/bigfix/20160519.195709.000.2.testz.Full Scan.json.raw
2016-05-20 11:02:50,589 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Processing 20160519.195709.000.2.testz.Full Scan
2016-05-20 11:02:50,589 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Call getAssetUpdates {testz:2:1463659029000}
2016-05-20 11:02:50,589 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.QvmJdbcCanonicalDataSource: [DEBUG] About to retrieve asset updates for scan: testz, using SQL

      SELECT 
            DISTINCT( asset.Asset.id )                          AS "assetid",
            inet( interface_data.ipAddress )                    AS "ipaddress", 
            assetproperty_data.assetName                        AS "assetname", 
            COALESCE( count_data.risk_score, 0 )::NUMERIC(10,2) AS "asset_riskScore",
            cve_data.vulninstance_id                            AS "vulninstance_id",
            vuln_data.vuln_id                                   AS "vuln_id",
            COALESCE( vuln_data.risk_score, 0 )::NUMERIC(10,2)  AS "vuln_riskScore",
            COALESCE( cve_data.refValue, 'NO CVE' )             AS "cve_id",
            assetproperty_data.bigFixAgentId                    AS "bigfixagentid"
        FROM asset.Asset
        LEFT OUTER JOIN (
            SELECT 
                ip_agg.assetId assetId,
                trim( '#' from substring( string_agg( ip_agg.ipAddress, '#' ) FROM '[A-Z,a-z,0-9,.,:]+#?' ) ) ipAddress,
                ip_agg.ipCount ipCount
            FROM (
                SELECT
                    iface.assetId AS assetId,
                    CASE WHEN ip.ipv4Address IS NOT NULL THEN ip2Address( ip.ipv4Address ) ELSE ip.ipv6Address END AS ipAddress,
                    COUNT( CASE WHEN ip.ipv4Address IS NOT NULL THEN ip2address( ip.ipv4Address ) ELSE ip.ipv6Address END ) OVER assetIdWin AS ipCount,
                    GREATEST( ip.lastSeenScanner,ip.lastSeenProfiler,ip.created ) AS lastObserved,
                    MAX ( GREATEST ( ip.lastSeenScanner, ip.lastSeenProfiler, ip.created ) ) OVER assetIdWin AS lastObservedMax
                FROM asset.IpAddress ip  
                INNER JOIN asset.Interface iface ON ip.interfaceId = iface.id 
                WINDOW assetIdWin AS ( PARTITION BY iface.assetId )
            ) ip_agg
            WHERE 
             ip_agg.lastObserved = ip_agg.lastObservedMax 
             GROUP BY ip_agg.assetId,ip_agg.ipCount
        ) interface_data ON interface_data.assetId = asset.Asset.id
        LEFT OUTER JOIN (
            SELECT  
                asset.AssetProperty.assetId AS assetId,
                array_to_string( array_agg( CASE WHEN asset.AssetProperty.assetPropertyTypeId = ( SELECT id FROM asset.AssetPropertyType WHERE lower(typeName) LIKE 'unified name' ) THEN asset.AssetProperty.propertyValue ELSE NULL END ), ',') AS assetName,
                array_to_string( array_agg( CASE WHEN asset.assetProperty.assetPropertyTypeId = ( SELECT id FROM asset.AssetPropertyType WHERE lower(typeName) LIKE 'big fix agent id' ) THEN asset.AssetProperty.propertyValue ELSE NULL END ), ',') AS bigfixAgentId
            FROM asset.AssetProperty
            GROUP BY asset.AssetProperty.assetId
        ) assetproperty_data ON assetproperty_data.assetId = asset.Asset.id 
        LEFT OUTER JOIN  (
            SELECT  
                v.assetId                                               AS asset_id, 
                count( DISTINCT v.vulnId )                              AS vulnerability_count,
                SUM( asset.VulnInstanceStatistics.adjusted_risk_score ) AS risk_score
            FROM asset.VulnInstance v
            LEFT OUTER JOIN asset.VulnInstanceStatistics ON v.id = asset.VulnInstanceStatistics.vulninstanceId
            WHERE ( CASE WHEN v.lastScannedFor IS NULL THEN DATE('1900-01-01') ELSE v.lastScannedFor END ) <= v.lastSeen 
            AND v.id NOT IN ( SELECT vulninstance_id FROM exception_rule.vuln_mgt_vulninstance WHERE NOW() <= except_until_date )
            GROUP BY asset_id
        ) count_data on count_data.asset_id  = asset.Asset.id 
       LEFT OUTER JOIN (
            SELECT  
                v.assetId                                               AS asset_id, 
                v.vulnId                                                AS vuln_id,
                v.id                                                    AS vulninstance_id,
                SUM( asset.VulnInstanceStatistics.adjusted_risk_score ) AS risk_score
            FROM asset.VulnInstance v
            LEFT OUTER JOIN asset.VulnInstanceStatistics ON v.id = asset.VulnInstanceStatistics.vulninstanceId
            WHERE ( CASE WHEN v.lastScannedFor IS NULL THEN DATE('1900-01-01') ELSE v.lastScannedFor END ) <= v.lastSeen 
            AND v.id NOT IN ( SELECT vulninstance_id FROM exception_rule.vuln_mgt_vulninstance WHERE NOW() <= except_until_date )
            GROUP BY v.assetId, v.vulnId, v.id
        ) vuln_data on vuln_data.asset_id = asset.Asset.id 
        LEFT OUTER JOIN  (
            SELECT DISTINCT( v.id ) AS vulninstance_id,
                   erv.refValue     AS refValue
        FROM ExtRef AS er
            INNER JOIN ExtRefValue erv ON er.extRefValueId = erv.extRefValueId
            INNER JOIN ExtRefType  ert ON erv.extRefTypeId = ert.extRefTypeId
            INNER JOIN asset.VulnInstance v ON v.vulnid = er.vulnid
            WHERE ert.extRefTypeId = 3
            GROUP BY v.id, erv.refValue
        ) cve_data on cve_data.vulninstance_id  = vuln_data.vulninstance_id
    
 
    
        RIGHT OUTER JOIN (
        SELECT cs.asset_id,
               ( SELECT COUNT( id ) 
                 FROM asset.VulnOnAssetScan 
                 WHERE asset_scan_id = cs.id 
                 AND is_found = false ) AS cleared
        FROM asset.CompletedAssetScan AS cs
        LEFT JOIN asset.ScanConfig sc ON cs.scan_config_id = sc.id
        WHERE cs.scanner_id = ?
        AND sc.scan_config_name = ?
        AND sc.config_type = ?
        ) custom_data ON custom_data.asset_id = asset.Asset.id
    
 
    
        WHERE assetproperty_data.bigFixAgentId != ''
        AND cve_data.refValue != 'NO CVE'
        AND ( COALESCE( count_data.risk_score, 0 )::NUMERIC(10,2) >= ?
              OR COALESCE( vuln_data.risk_score, 0 )::NUMERIC(10,2) >= ?
              OR custom_data.cleared > 0 )
    
 AND asset.Asset.id = ANY( string_to_array( ?, ',' )::BIGINT[] ) ORDER BY asset.Asset.id

2016-05-20 11:02:51,944 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.AssetUpdateBuilder: [DEBUG] Asset vuln update limit is 300
2016-05-20 11:02:51,944 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.QvmJdbcCanonicalDataSource: [DEBUG] Asset update retrieved 0 updates
2016-05-20 11:02:51,944 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Received 0 asset updates
2016-05-20 11:02:51,944 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor: [DEBUG] StoreCanonicalUpdates updateCount=0
2016-05-20 11:03:04,033 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] No data received on plugin publish queue, shutdown=false
2016-05-20 11:03:04,033 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] Read plugin publish queue
2016-05-20 11:03:34,033 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] No data received on plugin publish queue, shutdown=false
2016-05-20 11:03:34,034 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] Read plugin publish queue
2016-05-20 11:03:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.SourceDataDistributor: [DEBUG] Source data distributor timeout
2016-05-20 11:03:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.SourceDataDistributor: [DEBUG] SourceDataDistributor.distribute source=/store/qvm/adaptor/data
2016-05-20 11:03:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.utils.FileUtils: [DEBUG] ReadDataFiles suffix=.json dir=/store/qvm/adaptor/data
2016-05-20 11:03:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.SourceDataDistributor: [DEBUG] Processed 0 files, runForever is false
2016-05-20 11:03:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.SourceDataDistributor: [INFO] Start shutdown timer
2016-05-20 11:03:49,031 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Process raw data files
2016-05-20 11:03:49,031 [pool-2-thread-1] com.q1labs.qvm.adaptor.utils.FileUtils: [DEBUG] ReadDataFiles suffix=.raw dir=/store/qvm/adaptor/bigfix
2016-05-20 11:03:49,031 [pool-2-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] There are 0 raw data files in /store/qvm/adaptor/bigfix
2016-05-20 11:04:04,034 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] No data received on plugin publish queue, shutdown=false
2016-05-20 11:04:04,034 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] Read plugin publish queue
2016-05-20 11:04:34,034 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] No data received on plugin publish queue, shutdown=false
2016-05-20 11:04:34,035 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] Read plugin publish queue
2016-05-20 11:04:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginDriver: [INFO] StopOnComplete pluginCount=1
2016-05-20 11:04:34,039 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.SourceDataDistributor: [INFO] Stop data distributor
2016-05-20 11:04:34,040 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor: [INFO] StopOnComplete plugin processor for bigfix
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor: [INFO] Wait on reader service shutdown, plugin=bigfix
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] Process raw data files
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.utils.FileUtils: [DEBUG] ReadDataFiles suffix=.raw dir=/store/qvm/adaptor/bigfix
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$CanonicalDataProducer: [DEBUG] There are 0 raw data files in /store/qvm/adaptor/bigfix
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.bigfix.BixfixSession: [INFO] Close session
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginDriver: [INFO] Wait on shutdown latch
2016-05-20 11:04:34,041 [pool-1-thread-1] com.q1labs.qvm.adaptor.plugin.PluginDriver: [INFO] StopOnComplete: shutdown complete
2016-05-20 11:05:04,035 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [DEBUG] No data received on plugin publish queue, shutdown=true
2016-05-20 11:05:04,035 [pool-3-thread-1] com.q1labs.qvm.adaptor.plugin.PluginProcessor$PluginDataPublisher: [INFO] Data Publisher shutting down on request
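One thing worth noting in the SQL the adapter logs above: the outer WHERE clause only exports an asset if it has a non-empty BigFix agent ID property and at least one vulnerability with a real CVE reference, and at least one of the risk-score/cleared conditions holds. Here is a minimal Python sketch of that filter logic; the sample rows, field names, and the MIN_RISK threshold are illustrative assumptions (the real query substitutes `?` parameters and ORs together the asset risk score, the vuln risk score, and the cleared count):

```python
# Simplified sketch of the adapter's export filters (assumed sample data).
# An asset/vuln row is exported only if it has a non-empty BigFix agent ID,
# a real CVE reference, and a risk score at or above the threshold.
rows = [
    {"bigfixagentid": "",       "cve_id": "CVE-2016-0001", "risk_score": 9.5},
    {"bigfixagentid": "abc123", "cve_id": "NO CVE",        "risk_score": 7.0},
    {"bigfixagentid": "abc123", "cve_id": "CVE-2016-0002", "risk_score": 0.0},
]
MIN_RISK = 5.0  # placeholder for the '?' risk-score parameters in the query

exported = [
    r for r in rows
    if r["bigfixagentid"] != ""      # WHERE assetproperty_data.bigFixAgentId != ''
    and r["cve_id"] != "NO CVE"      # AND cve_data.refValue != 'NO CVE'
    and r["risk_score"] >= MIN_RISK  # simplified stand-in for the OR'd risk conditions
]
print(len(exported))  # each sample row fails one filter -> 0, matching "retrieved 0 updates"
```

So "Asset update retrieved 0 updates" can mean either that no assets matched the scan, or that every matching asset was filtered out, e.g. because the "big fix agent id" asset property was never populated.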

Can you please verify that logging is enabled in the log4j.xml file? Log files are
produced at /var/log/qvm-integration-adaptor.log and
/var/log/qvm-adaptor-cron.log. Please provide these logs if logging is enabled.

Hi patriot3w,

   Just a few tips on checking things on the BigFix side.
  1. Browse to https://localhost:52311/api/dashboardvariables/QRadarScan.ojo. If QRadar has sent data, it will be stored there in dashboard variables whose names are made up of the QRadar scan name and a date-time stamp. If there is nothing there, you know for sure that QRadar has not successfully posted data to BigFix.

  2. Check that the qrplugin is processing data correctly. The QRPlugin log files are normally stored under C:\Program Files (x86)\BigFix Enterprise\BES Server\Applications\Logs; they’ll have a name like qrplugin.24_03_2016.log.

  3. Change the log level for the qrplugin. Open the C:\Program Files (x86)\BigFix Enterprise\BES Server\Applications\qrplugin\properties.ini file. At the top there will be an entry logLevel.level=info. Place a hash in front of that line and remove the one in front of the debug entry, then restart the qrplugin. The easiest way to do this is to open Task Manager and kill the QRadarNode.exe process; after a minute or two the BES Plugin Service will notice the process has died and restart it, and it will then write debug output to the log file.
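Step 3 can be sketched in a few lines of Python. This is a hypothetical illustration run against a scratch copy of the file: the key names (`logLevel.level=info` and a hashed-out `#logLevel.level=debug` line) are assumptions based on the description above, so check your actual properties.ini before scripting against it:

```python
import tempfile
from pathlib import Path

# Scratch copy; the real file lives under the qrplugin directory on the BES server.
ini = Path(tempfile.mkdtemp()) / "properties.ini"
ini.write_text("logLevel.level=info\n#logLevel.level=debug\n")  # assumed sample content

out = []
for line in ini.read_text().splitlines():
    if line.startswith("logLevel.level=info"):
        out.append("#" + line)          # hash out the info entry
    elif line.startswith("#logLevel.level=debug"):
        out.append(line.lstrip("#"))    # re-enable the debug entry
    else:
        out.append(line)
ini.write_text("\n".join(out) + "\n")
print(ini.read_text())
```

After the swap (and a qrplugin restart), the plugin should write debug output instead of info-level output.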

Hope that helps.


Forgot to say: make sure to use a Master Operator when querying the BigFix REST API at https://localhost:52311/api/dashboardvariables/QRadarScan.ojo

  1. I’m not able to open this link, as I think the file was not created. However, https://localhost:52311/api/dashboardvariables does open.
  2. Yes, there are logs.
  3. No data from QRadar:

{"level":"info","message":"No new data detected from QRadar","timestamp":"5/20/2016, 8:23:55 PM"}
{"level":"info","message":"No new data detected from QRadar","timestamp":"5/20/2016, 8:28:55 PM"}
{"level":"info","message":"No new data detected from QRadar","timestamp":"5/20/2016, 8:33:55 PM"}
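Since the qrplugin writes one JSON object per line, the messages are easy to pull out programmatically when the log grows large. A small sketch using the sample lines above (the log format is inferred from this excerpt):

```python
import json

# One JSON object per line, as in the qrplugin log excerpt above.
log_lines = [
    '{"level":"info","message":"No new data detected from QRadar","timestamp":"5/20/2016, 8:23:55 PM"}',
    '{"level":"info","message":"No new data detected from QRadar","timestamp":"5/20/2016, 8:28:55 PM"}',
]
messages = {json.loads(line)["message"] for line in log_lines}
print(messages)  # every entry here is the same "no new data" message
```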

/var/log/qvm-integration-adaptor.log - see the logs in the post; how can I attach a file?
/var/log/qvm-adaptor-cron.log - no data, 0 bytes

I’d recommend you open a PMR to get support for QRadar - it looks like your issues are with QRadar and not with BigFix.

Please use the icon to upload the log file.

There is a limitation on uploading files to this portal: you cannot upload .txt or .log files.
He can upload them to a free cloud service and share that link with you.

I agree with gearoid. If you did not see QRadarScan.ojo under https://localhost:52311/api/dashboardvariables, it means QRadar has not successfully sent data.

Yes, I will open a PMR to check on this.

I’ve uploaded the logs to Dropbox:

BigFix is able to send the patch info to QRadar, but QRadar is not able to receive it. Let me check with support then.


On a slightly side topic - has anyone integrated BigFix as a log source? Not the patching/vulnerability information, but the actual BigFix activity logging.
In QRadar you can simply add it as a log source, enter the credentials, point it at Web Reports, and it’s supposed to work.

The ONLY logs I have received are "User login Success" - that’s it. No actions, no login failures, nothing. I have an open PMR with IBM, but no one has been able to tell me whether this integration has worked for other IBM customers. If yes, what events do you get?

Hi, have you been able to resolve this issue? I am facing a similar issue as well. Looking forward to your response. TIA

I think the issue was resolved after upgrading QRadar (some of the RPM packages); it only works after a certain version.