Best practice Very Large Deployment on RedHat

We are considering installing the most recent version of BigFix on RedHat with DB2. We estimate the installed client base will be between 175,000 and 210,000 clients. Will DB2 be able to support a deployment of this size? Are there any special considerations we should keep in mind? The box will be virtual, as physical hardware is not an option.

RHEL and DB2 can handle that scale; the considerations aren’t so much the software as the network architecture and hardware performance.

For a deployment of that size, I’d suggest engaging Professional Services to consult. Contact your TA or private message me if you’d like some info on how to get started.

If you are going to proceed on your own, I’d strongly recommend a detailed read-through of the Capacity and Planning Guide at https://bigfix-mark.github.io/ which will help you drill down into what kind of performance you’ll need from your server and storage, and includes some benchmark tools to help determine your capacity limits on whatever platform you are using.

I’d also point out that as of today, Insights is only an option on the SQL-based deployments, not on DB2.

There are a lot of considerations, listed in the Capacity guide, but at that scale storage speed is likely to be the biggest concern. All the deployments I’ve worked with at over 100k endpoints really benefit from RAID SSD storage - and if you can get NVMe SSD RAID, that would be best, especially for the database storage.
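Before running the full benchmark tools from the Capacity guide, a rough sequential-write test with `dd` can at least tell you whether the storage is anywhere near SSD-class throughput. This is just a sketch - the file path below is a placeholder, and you should point it at the volume that will actually hold the DB2 database:

```shell
# Write 256 MiB to the candidate database volume and force it to disk.
# conv=fsync makes dd flush before reporting, so the throughput figure
# reflects the storage rather than the page cache.
# NOTE: ./db2_io_test is a placeholder -- run this on your DB2 volume.
dd if=/dev/zero of=./db2_io_test bs=1M count=256 conv=fsync
rm -f ./db2_io_test
```

dd prints the throughput on stderr; NVMe RAID typically reports sequential writes in the GB/s range, while numbers in the low hundreds of MB/s suggest the storage will struggle at this endpoint count. For a real assessment, use a tool like fio with random I/O patterns, since database workloads are rarely sequential.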

Also plan to keep your Consoles close to the Root Server - we usually set up a Terminal Server to run the consoles and keep it on the same subnet as the root server itself to reduce console latency.


Thanks for the response. Insights not being supported on DB2 is a deal-breaker for me. Ugh - I was trying to save some money.

@Aram , can we share whether there are plans for DB2 support on Insights?