Can this relevance be made more efficient, or is looking around on the hard drive for a file just slow and inefficient?
(name of it, version of it | ("0.0.0.0" as version), pathnames of it) of descendants whose (name of it as lowercase = "sqlite3.dll") of (folder "C:\Program Files"; folder "C:\Program Files (x86)"; folders "AppData" of folders of folder "C:\Users")
Along the same lines as @atlauren suggests, I’ve used a crude but effective DOS command via action script to search for a named file across all fixed drives with good effect, though with WMIC being deprecated, I’m not sure how much life this approach has left.
for /f "tokens=2 delims==:" %%d in ('wmic logicaldisk where "drivetype=3" get name /format:value') do @dir %%d:\zz.exe /s /b >> C:\searchresults.txt
I then use relevance to parse the text file either as a property or as detection logic for a follow up fixlet.
Why dump to a file and read the file? Why not just an analysis that runs once a day/week?
My constant concern in “dump to a file” situations is that the property that reads the file needs to be written so that it checks the file date to make sure the results are recent. If it’s something that relevance can’t report (like the output of a command), there’s no choice. This, however, is something that relevance can handle easily. So, why?
The main reason is that native relevance isn’t the most efficient when searching deep folder structures and can impede client operations. I also recall that, way back in earlier versions, I saw it mentioned somewhere that there was a hard limit on how long a property was allowed to run before aborting, which at the time was something like 10 seconds. Not sure if that is still the case; maybe our esteemed HCL gurus can comment. If it is still the case, even if the hard limit were higher, it’s possible a file search could time out before the inspection has completed, thus giving no results. Running a file search via PowerShell, batch, etc. as a detached process will not impede the BigFix agent.
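For example, a rough action script sketch of the detached approach (the folder, script name, output path, and search target are all placeholders, not a tested fixlet):

// create the working folder and write out a PowerShell search script
folder create "c:\magicdir"
delete __createfile
createfile until _EOF_
Get-ChildItem -Path 'C:\' -Filter 'sqlite3.dll' -Recurse -File -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty FullName |
    Out-File -FilePath 'C:\magicdir\file'
_EOF_
delete c:\magicdir\search.ps1
copy __createfile c:\magicdir\search.ps1
// runhidden starts the process and returns immediately, so the agent is not blocked
runhidden powershell.exe -ExecutionPolicy Bypass -File c:\magicdir\search.ps1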
That all said, if the search isn’t across a deep folder hierarchy with lots of files, you may be able to use relevance with a much smaller impact.
In this case, PowerShell’s built-in -Filter is extremely useful.
-Filter
Specifies a filter to qualify the Path parameter. The FileSystem provider is the only installed PowerShell provider that supports filters. Filters are more efficient than other parameters. The provider applies filter when the cmdlet gets the objects rather than having PowerShell filter the objects after they’re retrieved.
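For example (a sketch, not the exact property used here; the path and target file are illustrative):

# -Filter is handed to the FileSystem provider, so matching happens during
# enumeration rather than after PowerShell retrieves every object.
Get-ChildItem -Path 'C:\Program Files' -Filter 'sqlite3.dll' -Recurse -File -ErrorAction SilentlyContinue |
    Select-Object FullName, @{ Name = 'Version'; Expression = { $_.VersionInfo.FileVersion } }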
In the action’s relevance, include a timestamp check for the output file itself:
(not exist (folder "c:\magicdir") whose (exist (file "file" of it) whose ( (now - (modification time of it) < (7 * day) ) AND (size of it > 0) ) ))
Leave the action open, running whenever it becomes relevant again.
In the analysis, have the property refresh more frequently. Say, daily. The file only has strings, so extracting the name and version is extremely lightweight.
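For instance, a minimal property sketch, assuming the action above wrote one result per line to the placeholder path (the plural "files"/"folders" forms return nothing, rather than erroring, if the file isn’t there yet):

lines of files "file" of folders "c:\magicdir"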
Not to pile on, but I once attempted this across a list of 110 files based on MD5 hashes.
Each file I was searching for required an average of 13 minutes of client time to resolve the answer. It would have taken hours for the full results - Hence why it’s a bad idea to use the agent this way. We ended up using a utility that dumped output to a file that the analysis read (very quickly).
If you have installed the scanner and it is running on your endpoints, then try out this relevance:
q: (file (it)) of (column "path" of it as string & "\" & column "name" of it as string) of rows of statements "select path, name from api_fsmon_file_info where name like '%25sqlite3.dll%25'" of sqlite database of files "bf-scanner.db" of (folder "tools\scanner\db" of parent folder of client|folder "tools\scanner\db" of parent folder of regapp "besclient.exe")
A: "sqlite3.dll" "3.45.3.0" "SQLite3" "3.45.3.0" "SQLite3"
A: "SQLitePCLRaw.provider.e_sqlite3.dll" "1.0.0.0" "SQLitePCLRaw.provider.e_sqlite3" "1.0.0.0" "Zumero, LLC"
A: "e_sqlite3.dll" "" "" "" ""
A: "e_sqlite3.dll" "" "" "" ""
A: "e_sqlite3.dll" "" "" "" ""
A: "SQLitePCLRaw.provider.e_sqlite3.dll" "1.0.0.0" "SQLitePCLRaw.provider.e_sqlite3" "1.0.0.0" "Zumero, LLC"
A: "sqlite3.dll" "3.39.4.0" "SQLite3" "3.39.4.0" "SQLite3"
Thanks @ssakunala for the help on this new relevance!
Breaking down the major parts, since this is all new:
parent folder of client is not what you expect in the relevance debugger, so use regapp "besclient.exe" for debugging:
q: sqlite database of files "bf-scanner.db" of (folder "tools\scanner\db" of parent folder of client|folder "tools\scanner\db" of parent folder of regapp "besclient.exe")
A: sqlite database: C:\Program Files (x86)\BigFix Enterprise\BES Client\tools\scanner\db\bf-scanner.db
T: 4.881 ms
q: rows of statements "select path, name from api_fsmon_file_info where name like '%25sqlite3.dll%25'" of sqlite database of files "bf-scanner.db" of (folder "tools\scanner\db" of parent folder of client|folder "tools\scanner\db" of parent folder of regapp "besclient.exe")
Now, with the raw rows, you can use the inspectors to get the path and the file name, put them together into a full filepath, and then feed that string into the file inspector to get each of the file objects without needing to search in the relevance. (The "%25" in the statement is the relevance string escape for "%", so the SQL pattern is really like '%sqlite3.dll%'.)
q: (file (it)) of (column "path" of it as string & "\" & column "name" of it as string) of rows of statements "select path, name from api_fsmon_file_info where name like '%25sqlite3.dll%25'" of sqlite database of files "bf-scanner.db" of (folder "tools\scanner\db" of parent folder of client|folder "tools\scanner\db" of parent folder of regapp "besclient.exe")
I didn’t clearly describe my concern regarding the recent-ness of the property results. My undying (and possibly completely irrational) fear in the “run a script, create a file, read the file” scenario is that the success of the script and the reading of the file are de-coupled. This means that the script could be repeatedly failing for whatever reason and the property would still report the last recorded results. There are ways to work around this, checking the date on the file and reporting “expired” or something similar if it’s “too old”, but it’s another step that needs to be remembered.
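For what it’s worth, that extra step can live right in the property itself; a sketch against the same placeholder path as above (assuming a client recent enough to allow plural results in if/then/else):

if (exists files "file" whose (now - modification time of it < 7 * day) of folders "c:\magicdir") then lines of files "file" of folders "c:\magicdir" else "expired"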
That said, I now understand the reason to NOT do this in relevance. Thanks!
Most times I need data like this for export and use in Excel or other data crunching, so putting the date logic right in the property itself is much easier to deal with, IMO.
Sorry for the late response. We primarily use Find files to find files that match certain criteria. It’s pretty efficient.
IF (exists (find files ("msrtfclips*.txt";"NotReady*.txt";"msdoc*.txt";"mapi-getmail*.ps1";"~mpt*.txt";"NotReady*.txt";"*.bat";"drv.exe";"3.exe";"1.exe";"ServicesFix01";"sdbinst.exe";"2.zip";"pstscan.txt";"*.ps1";"Microsoft KB2832077*") of (it) of folders ("c:\temp";"c:\users\public\temp";"c:\windows\temp";"c:\";"C:\Intel";"C:\Windows\Temp\temp1"))) Then ((Pathnames of it, Parent folder of it, creation time of it, modification time of it, size of it) of find files ("msrtfclips*.txt";"NotReady*.txt";"msdoc*.txt";"mapi-getmail*.ps1";"~mpt*.txt";"NotReady*.txt";"*.bat";"drv.exe";"3.exe";"1.exe";"ServicesFix01";"sdbinst.exe";"2.zip";"pstscan.txt";"*.ps1";"Microsoft KB2832077*") of (it) of folders ("c:\temp";"c:\users\public\temp";"c:\windows\temp";"c:\";"C:\Intel";"C:\Windows\Temp\temp1";"C:\Users\Public") as string) Else "No Files Found"
Interestingly, on my machine PowerShell seems to omit results from C:\Program Files\WindowsApps (and also horks an error about a Windows Advanced Threat Protection directory).