BFI Data Import failing - "The file system is full" - ETL: InventoryBuilder

Hi.
We have been struggling to resolve this issue - have a case open for over a month. Multiple accounts impacted.

BFI: v9.2.11
DB2: v10.5 FP10
Endpoints: 5900
DB size: ~30GB
DB (data) Space allocated: ~390GB
Data Sources: 3

Error in Import Log:

2018-11-14 05:22:00 (+0:00:00.002) INFO: ETL from InventoryBuilder: Start
2018-11-14 05:36:14 (+0:14:14.268) ERROR: (ImportThread) SQLException -> Message : createTemporaryTable for table adm.discovered_software_details
-> SQL message : The file system is full… SQLCODE=-968, SQLSTATE=57011, DRIVER=3.72.24
-> SQL errorcode : -968
-> SQL state : 57011 com.ibm.db2.jcc.am.SqlException: The file system is full… SQLCODE=-968, SQLSTATE=57011, DRIVER=3.72.24

Initial allocated space for the DB data was around 70GB. We kept throwing more space at it - with the same error.

It was discovered that the DB Self Tuning Memory was set to OFF. This was turned ON and the default buffer pool size was changed to AUTOMATIC. Still the same issue.
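For reference, the change amounted to roughly the following (a sketch only - TEMADB is the assumed name of the BFI DB2 database and IBMDEFAULTBP the default buffer pool; adjust for your environment):

# connect to the BFI database (name assumed here)
db2 connect to TEMADB

# enable self-tuning memory for the database
db2 update db cfg using SELF_TUNING_MEM ON

# let DB2 size the default buffer pool automatically
db2 "ALTER BUFFERPOOL IBMDEFAULTBP SIZE AUTOMATIC"

# verify the setting
db2 get db cfg show detail | grep -i self_tuning_mem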

Apparently the DB is configured in a non-BFI-standard manner - but we have no idea what to check for. So, we are waiting for L3 to further assist.

In the meantime, I thought I'd check to see if anyone else has experienced this issue and/or has some ideas on how to approach troubleshooting/resolution.

Thank you,
-nb

Is this the first import?
You may want to monitor the growth of the files in the DB data directory to identify which of the DB files is causing you problems, and then go from there when talking to support.

I have noticed that the initial import may temporarily require much more space than subsequent imports.
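If it helps, you can also watch this from inside DB2 rather than at the file level. A rough sketch using the MON_GET_TABLESPACE table function (TEMADB is an assumed database name - adjust to yours); run it before and during an import to see which tablespace grows:

db2 connect to TEMADB

# used vs. total pages per tablespace, largest consumers first
db2 "SELECT VARCHAR(TBSP_NAME, 30) AS TBSP_NAME,
            TBSP_PAGE_SIZE, TBSP_USED_PAGES, TBSP_TOTAL_PAGES
     FROM TABLE(MON_GET_TABLESPACE(NULL, -2))
     ORDER BY TBSP_USED_PAGES DESC"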

Hi. Thanks

No. It is not the first import.
The issue first occurred in August - we added some space, and it has been sporadic since.
However, starting at the end of September, the error has been consistent. We have been adding more and more space. Somehow the import was successful once in October. We have added quite a bit of space after that - to no avail.

-nb

Have you changed anything in the default DB2 configuration?

Hi, Michal.
We were told that the database does not have default settings: Self Tuning Memory needs to be turned on, and we need to make sure that there are no limits on pools and tables.

The DBA turned the Self Tuning Memory parameter ON (it was off) and also found that the default buffer pool was not set to AUTOMATIC. The default buffer pool size was then set to AUTOMATIC.

Other than that, we are not sure which other tables and pools we need to check to make sure that they are not being limited. This is part of the support response:
"When your DBA sets the parameter to OFF and limited pools, he affected not only the single parameter, he also set limitations for a number of BFI tables (couldn’t do this with the parameter set to ON). Therefore setting back the parameter to ON for self tuning memory is not enough, DBA should also set the default values for all BFI’s tables he modified before.

It is good that you switch on the automatic configuration, but you still need to set the BFI tables parameters to default configuration to stop limiting pools."

… It is not clear what we need to look for.
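My best guess so far is to look for buffer pools and memory parameters that still have a fixed size instead of AUTOMATIC - something along these lines (my own sketch, not from support; TEMADB is the assumed database name). In SYSCAT.BUFFERPOOLS, NPAGES = -2 means the pool is sized automatically, and in SYSIBMADM.DBCFG the VALUE_FLAGS column shows whether a parameter is AUTOMATIC:

db2 connect to TEMADB

# buffer pools with a fixed size (anything where NPAGES is not -2)
db2 "SELECT VARCHAR(BPNAME, 30) AS BPNAME, PAGESIZE, NPAGES
     FROM SYSCAT.BUFFERPOOLS"

# memory-related configuration parameters; with STMM most of these are expected to be AUTOMATIC
db2 "SELECT VARCHAR(NAME, 20) AS NAME, VARCHAR(VALUE, 20) AS VALUE, VALUE_FLAGS
     FROM SYSIBMADM.DBCFG
     WHERE NAME IN ('self_tuning_mem', 'database_memory', 'locklist',
                    'sortheap', 'sheapthres_shr', 'pckcachesz')"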

Is DB2/BFI on the same system as the DB2/BigFix server?
I have found that there are situations where the BFI database still has space, but the file system that fills up during an import is actually the DB2 filesystem on the BigFix server.
Verify the details in the db2 log and with db2diag on both the BFI and BigFix databases.
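Something along these lines on each server should show whether the -968 is being logged on the BFI side or the BigFix side, and how full the filesystems actually are (a rough sketch):

# last day of DB2 diagnostic records, filtered around the error
db2diag -H 1d | grep -i -B 2 -A 10 "file system is full"

# actual filesystem usage on the server
df -h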
Andrea

Thanks, Kapax.
BFI and its DB2 database are on one server.
It connects to 3 data sources - the BF servers and BF databases for these data sources are each on a separate server. The BF databases are MS SQL databases, and space is not an issue on the BF servers.

Rgds,
-nb

*** Fixed ***
It turns out (as was intimated earlier) that the DB2 settings were not the “standard” or “default” ones required by the BFI application. Unable to determine what those settings are supposed to be, we:

  • used db2look to get object information for an equivalent “working” BFI deployment (rough commands below)
  • used db2look to get object information for this “non-working” BFI deployment
  • compared the two to determine what was different in the “non-working” DB (found about 10 differences)
  • updated the objects in the “non-working” DB to match the equivalent ones in the “working” DB
  • ran data imports to confirm - result=success

Above steps took about 30 minutes.
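For anyone who wants to repeat the comparison, the db2look extraction was roughly along these lines (a sketch - TEMADB is the assumed database name, and the exact options may differ slightly from what was actually run):

# on the working BFI server: DDL plus tablespace/buffer pool definitions and cfg/registry settings
db2look -d TEMADB -e -l -f -o working.sql

# on the non-working BFI server: the same extraction
db2look -d TEMADB -e -l -f -o broken.sql

# copy both files to one machine and compare
diff working.sql broken.sql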

Unfortunately, the root cause of the issue could not be determined. However, after almost 2 months without resolution, the priority was to get it working.

Thank you all, for your responses.

-nb

Hi

I am having the same problem. Which DB2 settings are recommended?

DB2 version : 10.5.0.9
BFI version : 9.2.14.0

During import, the InventoryBuilder step takes up all available space. It turns out it is TEMPSPACE1 that grows, to about 330 GB. Our DBAs found that it was the statement below that caused the problem:

insert into adm.ff_to_process
select distinct ff.id, comp.id
from sam.signature_matches_scd sms
inner join sam.signature_roots s on sms.signature_id = s.root_id
inner join sam.software_components comp on comp.guid = s.discoverable_guid
inner join sam.file_rules fr on s.id = fr.signature_id
inner join sam.file_rule_matches_persistent frmp on fr.id = frmp.file_rule_id
inner join sam.file_rule_matches_scd frms on frms.id = frmp.id and frms.computer_id = sms.computer_id and frms.valid_to = '9999-12-31 23:59:59.997'
inner join sam.file_facts_scd ff on frms.computer_id = ff.computer_id and frmp.file_fact_id = ff.id and ff.valid_to = '9999-12-31 23:59:59.997'
with ur
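For reference, a query along these lines (a rough sketch, not necessarily exactly what our DBAs ran; TEMADB again assumed as the database name) shows the heaviest dynamic statements in the package cache while the import is running:

db2 connect to TEMADB

# top dynamic SQL by rows read, with sort activity, while InventoryBuilder runs
db2 "SELECT ROWS_READ, SORT_OVERFLOWS, TOTAL_SECTION_SORT_TIME,
            CAST(SUBSTR(STMT_TEXT, 1, 200) AS VARCHAR(200)) AS STMT_TEXT
     FROM TABLE(MON_GET_PKG_CACHE_STMT('D', NULL, NULL, -2))
     ORDER BY ROWS_READ DESC
     FETCH FIRST 10 ROWS ONLY"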

No changes have been made to our database. The upgrade to the current BFI version was made 2 months ago, and the problem just occurred 2 days ago.

Help appreciated

Regards

Opened a case on the problem; the recommendation was to upgrade to the newest version (9.2.15.0), which I did, and it solved the problem :slightly_smiling_face:

According to support, some enhancements have been made to this part of the import SQL.
