Looking for a file in a mapped drive: runas action script or relevance questions

Is there a method to inspect the currently logged-in user’s mapped drive for a specific file? I’m looking to see if this is at all possible in BigFix. Other solutions are being considered, but we’re trying to leverage this tool if we can.

i.e., we’re looking for a file that would exist in a specific folder when a user who has that folder mapped is logged in.

I don’t think this could be accomplished with relevance, but is there an inspector for it? Other than re-running an action every X minutes, is there a more efficient method to discover this?

Looking for:
windows systems only
domain\joeschmo
Z:\Thisfolder\Badfilehere.xyz

The SYSTEM account does not have access to the Z:\ drive, and depending on the user, the UNC path may vary wildly due to nesting of groups and storage locations.

My current thought is to leverage a runas currentuser action with a script that writes a file or registry key, which standard relevance can then pick up to capture the historical presence of this file.
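A minimal sketch of that idea in BigFix action script, assuming the standard override/runas=currentuser syntax (the MyOrg folder and marker file are hypothetical names for illustration):

```
// Runs in the interactive user's session, so the user's mapped Z:\ is visible.
override wait
runas=currentuser
completion=job
wait cmd.exe /c "if exist "Z:\Thisfolder\Badfilehere.xyz" (mkdir "C:\ProgramData\MyOrg" 2>nul & echo found>"C:\ProgramData\MyOrg\badfile_seen.txt")"
```

The marker file lands in a machine-wide location (C:\ProgramData here) precisely so that relevance running as SYSTEM can see it later, even after the user logs off.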

Would this be better done using Windows auditing on the box that hosts the share & folder in question?

In my opinion, that would be the way to go… but it gets tricky, as storage share providers are legion and distributed across a large swath of responsibility and access. We have one common unifier, and that is BigFix on all client endpoints, which is why I was thinking about how to accomplish this with BigFix.

Documentation about the audit log is available here: Server Audit Log.
I would suggest submitting this to our Ideas Portal to gauge interest and the potential inclusion in the product.


Another way that may be more consistently successful than runas currentuser would be to create an AD service account with appropriate permissions to the targeted share and file. In a BigFix job, create a scheduled task containing a batch or PowerShell script that maps the drive, looks for the remote file, then writes a log locally. The key is to create the scheduled task in the context of the service account. I recommend creating an XML file for the task definition.
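As a rough sketch, the script the task runs might look like this in PowerShell (the UNC path, log location, and drive name are placeholders for illustration, not values from the original post):

```
# Runs as the service account via the scheduled task, so the account's
# share permissions apply regardless of who is logged on interactively.
$share = '\\fileserver\share'                            # placeholder UNC path
$log   = 'C:\ProgramData\MyOrg\badfile_check.log'        # local log for BigFix to read

# Map the share, test for the file, append a timestamped result line
New-PSDrive -Name Z -PSProvider FileSystem -Root $share -ErrorAction Stop | Out-Null
$found = Test-Path 'Z:\Thisfolder\Badfilehere.xyz'
"$(Get-Date -Format o) found=$found" | Out-File -FilePath $log -Append
Remove-PSDrive -Name Z
```

Appending rather than overwriting preserves a history of checks, which is handy if you want the “historical presence” angle mentioned earlier in the thread.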

Once the task has run as the service account (which can happen regardless of interactive user logon), you can have an analysis scoop up the results of the log your script created.
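The analysis property could then be a simple relevance expression against that log, something like the following (the log path and the found=True marker are assumptions matching whatever your script actually writes):

```
exists lines whose (it contains "found=True") of file "C:\ProgramData\MyOrg\badfile_check.log"
```

This returns True on any endpoint where the script ever recorded a hit, which gives you the historical-presence signal without re-running the check from the console.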

We like this approach so much that we have a generic service account permanently assigned and a ‘shell’ job with the task, xml, and powershell already there that is just waiting for us to plug in a value to execute.

The applicable xml section is:

<Principal id="Author">
  <UserId>domain\user</UserId>
  <LogonType>Password</LogonType>
  <RunLevel>HighestAvailable</RunLevel>
</Principal>

Then your actionscript would include something like this:

schtasks /create /xml path\task.xml /RU domain\user /RP pipe_in_password_from_secure_parameter_or_location /TN task_name
