As part of an analysis, I’m attempting to parse BES Client log files. Inactive logs from previous days parse without any trouble, but trying to parse the active log yields “class FileIOError”. Is there any reason a BES Client can’t parse its own active log, especially since it owns the locks on the log? Are there any tricks to reading through the active log?
We have frequent problems with partial agent corruption. I’m parsing the logs looking for custom site signature errors, gather errors, unhandled exceptions, and corrupt custom site data. It basically extends the health check concept to cover these additional criteria.
Is there a solution to this FileIOError problem? I have a similar problem, but in my case it’s not the BES Client log, it’s a log file of another application. The BES analysis inspectors can access the file and retrieve its “last modification date” and “file size”, but any attempt to get the contents of the file (“line 1 of file …”) results in a FileIOError. The file sits on a Windows Server 2003 machine and is currently about 10 MB in size.
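To illustrate (the path here is just a placeholder for the real one), evaluating these in the Fixlet Debugger shows the split: the first two expressions return values, while the third fails with “E: class FileIOError” whenever the application holds the file open for writing:

    q: modification time of file "C:\MyApp\logs\app.log"
    q: size of file "C:\MyApp\logs\app.log"
    q: line 1 of file "C:\MyApp\logs\app.log"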
I assume the reason is that the BES Client tries to open the file in something other than “READONLY” mode, which of course is prohibited, since the running application needs exclusive write access. I do have other applications on this machine (e.g., Notepad) that can open and read this log file.
How could the relevance be expressed to find the lines containing error messages?
I believe this error occurs when the BES Client tries to open a file that is currently held open with a write lock. The BigFix Agent will try to open it READONLY, but it gives up if the file is already open for writing. Notepad and other applications will happily open a file that is already open for writing, but the BigFix Agent is not that adventurous (to avoid causing issues).
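Once the client can actually read the file (for instance, a copy that isn’t write-locked), relevance along these lines should pull out the error lines; the path and the “ERROR” marker are placeholders for whatever your application actually writes:

    lines containing "ERROR" of file "C:\MyApp\logs\app.log"
    number of lines containing "ERROR" of file "C:\MyApp\logs\app.log"

Note that “lines containing” is a case-sensitive substring match; for anything fancier you can use a whose-clause, e.g. lines whose (it contains "ERROR" or it contains "error") of file "…".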
How should a user handle this? I’d like to (in some manner) upload data from an active (locked) logfile. Since I can’t read a locked file with relevance, I imagine I would need a task to copy the file off to a temp directory so that I can read the copy. If I copy the entire logfile over each time, will I get duplicate results in my analysis from the repeated upload of the full file? Can I just set a task to run every hour that copies the active logfile to a temp directory, and create an analysis to pull the lines from the copy?
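Something like this is what I have in mind for the task’s actionscript (the paths are hypothetical); as I understand it, the copy command fails if the target already exists, so the stale copy would need to be deleted first:

    // remove the previous copy; the actionscript copy command fails if the target exists
    delete "C:\Temp\app_log_copy.txt"
    // note: this only works if the writing application opened its log with read sharing
    copy "C:\MyApp\logs\app.log" "C:\Temp\app_log_copy.txt"

The analysis property would then read the copy instead of the live file, e.g. lines containing "ERROR" of file "C:\Temp\app_log_copy.txt". And since an analysis property reports its current values on each evaluation, my understanding is that re-reading the full copy should replace, rather than duplicate, the reported lines.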