I have a problem, and I’m not sure of the best way to solve it within BigFix’s limitations.
I have a CSV that gets deposited on the server containing location-specific information for all our sites: ownership type, location number, address, etc. The endpoints download this file, locate the line of data that corresponds to them, and process it for use later. They download it daily, so if the data changes they will only be out of sync for a day.
This works fine, but ideally we want them to download only their specific data instead of the whole file with everyone else’s data that they don’t need. So I wrote a server-side script to split the CSV into a separate text file for each location. This would result in a ~1KB download instead of ~400KB, and our network bandwidth could certainly use the savings.
The problem I have is with the endpoint ActionScript. What I currently have is this, which grabs the entire CSV:
begin prefetch block
add nohash prefetch item url=http://127.0.0.1:52311/Uploads/LDB/feed.csv
end prefetch block
But what I need is something like this, which would grab the tiny text file:
parameter "LocationId" = "{some relevance to determine the locationId based on the device name}"
begin prefetch block
add nohash prefetch item url=http://127.0.0.1:52311/Uploads/LDB/LocationInis/{parameter "LocationId"}.ini
end prefetch block
But I quickly found out that you aren’t allowed to use relevance substitution in a nohash prefetch command. However, I need nohash because these files are not static. And I believe the download now / download now as commands cause a direct download from the server instead of going through the relay structure, so those wouldn’t be feasible in my environment, where many endpoints cannot talk to the root server.
If anyone has solved a similar use case, I would appreciate any suggestions / ideas.
Something I’ve done is wrap my prefetch in an “if” statement and base the prefetch on the outcome of the condition.
if {computer name as lowercase contains "server1"}
begin prefetch block
add nohash prefetch item url=http://127.0.0.1:52311/Uploads/LDB/LocationInis/server1.ini
end prefetch block
elseif {computer name as lowercase contains "server2"}
begin prefetch block
add nohash prefetch item url=http://127.0.0.1:52311/Uploads/LDB/LocationInis/server2.ini
end prefetch block
endif
@jmaple: Thanks, but I don’t think this is feasible, since I have ~3000 locations / individual files, which with this solution would result in an action script of over 15,000 lines.
@strawgate: Can you elaborate a bit more? Would this solution also require having ~3000 hashes in the action script?
You could write a script that parses your CSV and issues mailbox actions to modify the values on the client itself. You could add logic to only issue mailbox actions if the client has incorrect values, only replace mailbox actions if they are different, etc. You’ll be looking at 1 mailbox action per set of information you’re looking to distribute (best case 1 mailbox action per site, worst case 1 mailbox action per client).
With mailbox actions, potentially, the client wouldn’t be downloading any content at all other than the fixlet content.
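As a rough sketch of what each generated mailbox action might contain, the per-client action script could be just a few setting commands (the setting names and values below are hypothetical placeholders your generator would fill in from that site’s CSV row):

// generated per-client action script; names/values are placeholders from the CSV row
setting "_LDB_LocationId"="1234" on "{parameter "action issue date" of action}" for client
setting "_LDB_OwnershipType"="Franchise" on "{parameter "action issue date" of action}" for client
setting "_LDB_Address"="123 Main St" on "{parameter "action issue date" of action}" for client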
Web Server Option:
To be honest, I think the best solution would be to set up a web server (I’d probably just use ASP.Net and IIS), create an API endpoint that provides this information in XML or JSON or something, and then write a fixlet that has the endpoint gather this information at a certain frequency.
Then use an analysis on the client to read the values out of the XML to populate a property.
This would reduce your bandwidth requirements to well under 1 KB, and you’d be able to put the logic on the web server side (even if the web server is just pulling the info out of a CSV!) to decide what information to provide to the client.
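To sketch the analysis side (assuming the fixlet saves the response as key=value lines to a fixed path; the path and key name here are made up), a property could be a single line of relevance:

(following text of first "=" of it) of lines whose (it starts with "OwnershipType=") of file "C:\LDB\location.ini"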
I do think it’d be interesting to explore leveraging the mailbox feature for this, but hopefully this will at least give you some ideas on one way to approach this with dynamic downloads (i.e. with a manifest): Non-predetermined downloads
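Roughly, the shape from that article is: prefetch a static helper script, run it as a prefetch plug-in to build a per-client manifest (relevance substitution is allowed at that stage), then add the generated manifest. A sketch, with the helper script, sha1/size values, and file names as placeholders (also note that nohash URLs resolved this way generally need to pass the server’s dynamic-download whitelist):

begin prefetch block
parameter "LocationId" = "{some relevance to determine the locationId based on the device name}"
// fetch a static, pre-hashed helper script first
add prefetch item name=buildmanifest.vbs sha1=<sha1> size=<size> url=http://127.0.0.1:52311/Uploads/LDB/buildmanifest.vbs
collect prefetch items
// run the helper during prefetch; relevance substitution is allowed in this command
execute prefetch plug-in "{pathname of system folder & "\cscript.exe"}" //nologo "{download path "buildmanifest.vbs"}" "{parameter "LocationId"}" "{download path "location.spec"}"
// location.spec now contains a line such as:
// add nohash prefetch item url=http://127.0.0.1:52311/Uploads/LDB/LocationInis/1234.ini
add prefetch item {download path "location.spec"}
collect prefetch items
end prefetch block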
@strawgate: That’s a very cool idea regarding the mailbox actions. Interestingly enough, we currently have an issue in our environment where mailboxed actions get blocked on some endpoints, most likely due to Sophos or a Squid cache issue, but I will file this away because I can definitely see it being useful. As for the web server idea, unfortunately that would involve other departments and would most likely result in a time frame exceeding the man-hours I have to complete this task (hint: “as little time as possible”).
@Aram: I read up on the manifest and it does seem to match my use case exactly. I will plan to move forward with that.
@Sean: We have a similar situation: lots of dynamic content that we need machines to be able to download. I’ve solved the problem in three different ways for different systems, and we use each method successfully and to its strengths.
These solutions assume the dynamically generated files are hosted on a central web server in a structure that facilitates location differentiation:
/web/{location ID}/stuff to download
WGET is a very handy way to approach this. Upload wget as a utility (if you don’t already have it on your clients) and simply use relevance substitution in the URL you pass to wget, including the location ID. There are many parameters and options within wget that you should be able to adapt to your situation.
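For instance (the wget path, output file, and host are placeholders for whatever fits your environment):

parameter "LocationId" = "{some relevance to determine the locationId based on the device name}"
// wait runs wget directly, so relevance substitution in the URL is fine here
wait "{pathname of client folder of current site}\wget.exe" --quiet --output-document="{pathname of client folder of current site}\location.ini" "http://webserver/web/{parameter "LocationId"}/location.ini"

Keep in mind this pulls straight from the web server, so it doesn’t flow through the relay hierarchy.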
A web service call (JSON/XML/SOAP) is another way; we use one to hit a particular backend database. I just had one of our developers cook up a web service front end similar to what @strawgate is proposing.
The older native BigFix way to do this is the ‘download now as’ command: it bypasses the relay hierarchy, but it does accept relevance substitution.
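Applied to this thread’s layout it would look something like this (the host name is a placeholder; since the download goes straight to the source, the 127.0.0.1 shorthand used in prefetch URLs would resolve to the client itself here):

parameter "LocationId" = "{some relevance to determine the locationId based on the device name}"
download now as location.ini http://bigfixserver.example.com:52311/Uploads/LDB/LocationInis/{parameter "LocationId"}.ini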
They left relevance substitution out of prefetch for security reasons, which I understand. The typical answer, manifests, is great for infrequently updated trusted files, but hardly a good solution for thousands of daily dynamic files, especially ones outside the BigFix ecosystem such as a corporate ERP.
Thanks for the info, Jon. I don’t think I would be allowed to try this with our current infrastructure and network limitations, but it may come in handy in the future.