Trying to pull the number of entries shown in 50 log files

So I have a set of 50 log files that are all set up as log.out, then log.out.1, then log.out.2, etc., until it gets to 50. They reside in the /computer/logs folder on Linux computers. I am trying to pull out the number of times I see an OutOfMemory error in the logs as an analysis, but I don’t know how to pull from all of the logs at once instead of viewing one at a time.

Here’s what I have:
number of lines containing "OutOfMemory" of file "/computer/logs/log.out"

That returns the count from the one file, but if I try to add a wildcard at the end it doesn’t work. I have also tried some regular expressions, and that doesn’t seem to work either. Any assistance would be greatly appreciated.

Not at a computer now so I can’t test, but I’d try

sum of numbers of lines containing "OutOfMemory" of find files "log.out*" of folders "/computer/logs"

Could also try this, again not in a position to test it just now

sum of lines whose (it contains "OutOfMemory") of files whose (name of it contains ".out.") of folder "/computer/logs/"


I appreciate the responses, but both options are just returning undefined results when I run the analysis across all computers.

Jason was close - too much plural relevance ("numbers of lines" should be "number of lines"), which is an unusual state:

sum of number of lines containing "OutOfMemory" of find files "log.out*" of folders "/computer/logs"
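
For anyone puzzling over the singular/plural distinction, a parenthesized form can make the per-file counting explicit. This is an untested sketch using the same paths as above and should return the same total; "it" refers to each file returned by "find files":

sum of (number of lines containing "OutOfMemory" of it) of find files "log.out*" of folders "/computer/logs"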


I had the same issue, and the above-mentioned query doesn’t work for me.

Please use either /opt/BESClient/bin/QNA or the WebUI Query app to run these queries and post the results back here so we can see where the issue is:

exists folders "/computer/logs"

number of find files "log.out*" of folders "/computer/logs"

number of lines containing "OutOfMemory" of find files "log.out*" of folders "/computer/logs"

sum of number of lines containing "OutOfMemory" of find files "log.out*" of folders "/computer/logs"
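
If the undefined results are coming from computers that don’t have the folder or any matching log files, one common relevance pattern is to add a fallback with the "|" operator so the property reports 0 instead of an error. This is an untested sketch along the same lines as the working query:

(sum of number of lines containing "OutOfMemory" of find files "log.out*" of folders "/computer/logs") | 0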

Thanks trn, that ended up working for me.
