Generating client logs then collecting the logs in a usable way

I have a task that runs a PowerShell script on certain systems. The script generates a CSV in a specific directory on those systems. I have been using an analysis that reads the lines of the CSV, which I then view in a Web Report. While this has worked, exporting the report as a CSV is troublesome, since it's effectively my CSV nested inside the Web Reports CSV. I can manipulate the exported CSV so that it's usable, but that takes time.

Lately I've been using the archive function to get my CSVs back to the BES server. This works, but now what? Browsing to BigFix Enterprise\BES Server\UploadManagerData\BufferDir\sha1 and then opening each randomized folder name to grab 200+ CSVs is also clunky.

Can anyone suggest a better way to do what I’m trying to do?

Thanks!

What do these CSVs contain?

Also, you could probably get the contents from a property using the REST API.

The folder names are not randomized; they are based on the Computer ID of the computers in question. If the CSVs have a consistent name and you know which computers will be uploading them, you could grab them from those folders automatically.
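As a minimal sketch of that idea, run on the BES server itself: the install path and the file name LocalAccounts.csv below are placeholders, so adjust both to match your environment and whatever name your script gives the uploaded file.

```powershell
$bufferDir = 'C:\Program Files (x86)\BigFix Enterprise\BES Server\UploadManagerData\BufferDir\sha1'
$outFile   = 'C:\Temp\AllLocalAccounts.csv'

# Each computer's uploads land under a folder derived from its Computer ID,
# so recurse the sha1 tree and merge every matching file into one CSV.
Get-ChildItem -Path $bufferDir -Recurse -Filter 'LocalAccounts.csv' |
    ForEach-Object {
        # Tag each row with the parent folder name so you can tell which computer it came from
        $computerId = $_.Directory.Name
        Import-Csv -Path $_.FullName |
            Select-Object @{Name = 'ComputerID'; Expression = { $computerId }}, *
    } |
    Export-Csv -Path $outFile -NoTypeInformation
```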

What do these CSVs contain?

Detailed local account information: account name, display name, enabled/disabled, last logon, and a true/false flag indicating whether the user is a member of the local Administrators or Remote Desktop Users groups. I know PowerShell, so I write PowerShell. I don't doubt there's a way to do this with ActionScript, but I'm not too ActionScript savvy.
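It's essentially along these lines (a simplified sketch rather than the exact script; it assumes PowerShell 5.1+ for the Get-LocalUser and Get-LocalGroupMember cmdlets, and the output path is just a placeholder):

```powershell
# Simplified sketch only - assumes PowerShell 5.1+; output path is a placeholder
$admins = (Get-LocalGroupMember -Group 'Administrators' -ErrorAction SilentlyContinue).Name
$rdp    = (Get-LocalGroupMember -Group 'Remote Desktop Users' -ErrorAction SilentlyContinue).Name

Get-LocalUser | ForEach-Object {
    [PSCustomObject]@{
        AccountName  = $_.Name
        DisplayName  = $_.FullName
        Enabled      = $_.Enabled
        LastLogon    = $_.LastLogon
        # Group member names come back as COMPUTERNAME\account
        IsLocalAdmin = $admins -contains "$env:COMPUTERNAME\$($_.Name)"
        IsRDPUser    = $rdp -contains "$env:COMPUTERNAME\$($_.Name)"
    }
} | Export-Csv -Path 'C:\Temp\LocalAccounts.csv' -NoTypeInformation
```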

Can you elaborate on the REST API? I've wrapped PowerShell around REST only a few times and have found @jpsthecelt's examples here. Anything else to guide me in the right direction?

Thanks @jgstew, as always you’ve been a lot of help!

ActionScript is not what I was getting at. ActionScript can be used to run anything from the command line, including PowerShell, and there is nothing wrong with that approach.

However, most or all of this info can be had with relevance directly, without needing to run anything periodically to collect it. You can create an analysis, add properties with the relevance to gather this info, and even set the report period for the properties; I would recommend once per hour unless the relevance is slow. You can then export the results to CSV in a number of ways, or use the REST API to pull the results and process them further automatically if you need to react to the information.
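For the REST API side, something along these lines is a starting point from PowerShell. This is just a sketch: it assumes the default port 52311, an operator account with API access, and an analysis property named "Local Accounts" (a placeholder), and it uses the /api/query session relevance endpoint.

```powershell
$server = 'https://bigfix.example.com:52311'   # placeholder server name
$cred   = Get-Credential                       # operator account for the REST API

# Session relevance pulling (computer name, reported values) for one property;
# swap in the real property name from your analysis.
$relevance = '(name of computer of it, values of it) of results of bes property "Local Accounts"'
$uri = "$server/api/query?relevance=" + [uri]::EscapeDataString($relevance)

$response = Invoke-RestMethod -Uri $uri -Credential $cred

# Tuple results come back as <Tuple><Answer>...</Answer></Tuple> XML elements;
# flatten them into rows that can be exported or processed further.
$response.BESAPI.Query.Result.Tuple | ForEach-Object {
    [PSCustomObject]@{
        Computer = $_.Answer[0].'#text'
        Value    = $_.Answer[1].'#text'
    }
} | Export-Csv -Path 'C:\Temp\PropertyResults.csv' -NoTypeInformation
```

If your server uses a self-signed certificate, you may need to handle certificate validation in the session before the call will succeed.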

In general, anything that can be collected by relevance directly instead of running a periodic action is better done through relevance.

I don’t believe there is currently a single analysis that has all of the info you are looking for, but there are relevance examples scattered throughout BigFix.Me and the forums.

Start here:

I also have one I've been working on that I'm not sure I ever posted anywhere, and there are definitely some individual relevance pieces related to this.

If you could post some or all of your PowerShell here, that would also help me see exactly what you are collecting so I can better tell which relevance would do the same. Just copy and paste it into a forum post and save it without modifications; the forum may mangle the text when it tries to apply formatting, but I can fix that after the fact for you.