Query for folder size

Does this query make sense? It's timing out when trying to get the folder size (if the folder is present at all).

concatenation ":" of unique values of (it as string as trimmed string) of (pathnames of it, ((sum of sizes of descendants of it)/1024/1024) as string & "MB") of folders whose (pathname of it contains regex "tivoli|IBM|tiaa-itm*") of folder "/opt" as string

I can get the size of the folders via "du" under /opt/IBM, /opt/tivoli, and /opt/itm. The query seems to trip up at /opt/tivoli.
The debug messages are not much help either.

Wed, 22 Apr 2020 12:35:37 -0400 DebugMessage [ThreadTime:12:35:36] QueryReportManager - Attempting to post report
Wed, 22 Apr 2020 12:35:37 -0400 DebugMessage [ThreadTime:12:35:37] QueryReportManager DoUploadFile: None authentication used.
Wed, 22 Apr 2020 12:35:37 -0400 DebugMessage [ThreadTime:12:35:37] BigFix Query Report posted successfully
Wed, 22 Apr 2020 12:35:37 -0400 DebugMessage [ThreadTime:12:35:37] QueryReportManager - report sent successfully.
Wed, 22 Apr 2020 12:35:37 -0400 DebugMessage [ThreadTime:12:35:37] Query with id 6261 completed the processing, but an error occurred

It makes sense that it would time out, as summing the sizes of all descendants is a potentially very costly operation. How many descendants does that folder have?
You can use the setting _BESClient_Query_MOMaxQueryTime (_BESClient_Query_NMOMaxQueryTime) to control the timeout for a master (non-master) operator, but I would advise against using anything much more than 300 seconds.
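
If you want to see what a given client currently has configured, here is a quick relevance sketch (the setting only exists if it has been set at some point):

if (exists setting "_BESClient_Query_MOMaxQueryTime" of client) then (value of setting "_BESClient_Query_MOMaxQueryTime" of client) else ("not set")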

Is there maybe a different way to get at the same data you are looking for?

/opt may have 5-10 folders, but I am specifically looking for the sizes of /opt/IBM, /opt/tivoli, and /opt/itm. I have set the query timeout to 900 seconds from the WebUI.

I think the main question is how many files/folders you have under /opt/IBM, /opt/tivoli, and /opt/itm. That is most likely what is causing the timeout.
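
One thing that should help either way is to target the three folders by name instead of regex-matching every folder under /opt, so the client never descends into unrelated trees. A rough sketch using the paths from this thread (untested):

(pathname of it & ": " & (((sum of sizes of descendants of it) / 1024 / 1024) as string) & " MB") of folders whose (name of it = "IBM" or name of it = "tivoli" or name of it = "itm") of folder "/opt"

This returns one pathname/size pair per folder and would work as an analysis property as well.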

Nah, they're small folders: one is just 70 MB, the others around 100 MB. Anyway, it worked using an analysis; the UDP query must be timing out.

The number of files, not the total size of the files, would be the expensive thing here.

I agree. If the number of descendants is already known, the expense isn't too bad, but if you are unsure how many directories or files there are, it's best to use a script to loop over the directories for that information instead of trying to do it in relevance.
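
For example (the path and filename here are just an illustration), a cron job or a scheduled task could run du -sm /opt/IBM /opt/tivoli /opt/itm and redirect the output to a file, and an analysis property could then read that file back cheaply:

lines of file "/var/opt/BESClient/folder_sizes.txt"

That keeps the expensive filesystem walk out of relevance evaluation entirely.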