Trying to find a more efficient way to do the following. Any suggestions would be appreciated.
Currently I have a process that exports every endpoint in our Asset Management Tool to a single delimited txt file, which contains the hostname and a bunch of other important information like server owner, support groups, etc. This single file is then compressed and placed on the main BigFix server. I then have a policy TEM action for every client that downloads the compressed file from the main BES, extracts it, finds itself in it, and sets a series of client settings.
The problem is that, regardless of whether a client’s information in the text file changes or not, each client goes out and downloads that compressed file (350 KB) every 24 hours. What I would like to do is have the clients only download it when they need to, which would involve telling a client that a difference has occurred and it needs to go out and download the most recent version.
My problem though is I can’t think of a way for my Main BES (or any other client for that matter) to go tell clients that they need to run an action. Can anyone think of a way for a client to tell another client that something is relevant and to go download the new compressed file?
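To illustrate the behavior I’m after, here is a rough sketch in Python (not actual BigFix relevance). The idea of publishing a tiny digest “stamp” file alongside the big archive is just my hypothetical mechanism: the client fetches only the small stamp, compares it to the digest it recorded after its last download, and pulls the big file only when they differ.

```python
import hashlib

def file_digest(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_refresh(stamp_path, last_seen_digest):
    """Compare the tiny published digest stamp against the digest the
    client recorded after its last download; True means the big
    compressed file has changed and should be fetched again."""
    with open(stamp_path) as f:
        current = f.read().strip()
    return current != last_seen_digest
```

The server-side export job would write `file_digest("export.zip")` into the stamp file each run; a 64-byte check every 24 hours is a lot cheaper than a 350 KB download.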
You don’t mention how many systems are involved, but from the size of the compressed file, it’s quite a few.
Normally, TEM caches downloads on the Relays and clients don’t actually download files from their source. I would think this would interfere with the client detecting changes to the information, and to detect a change to its information embedded in a large file, it would have to download and decompress EVERYONE’S data first.
Wouldn’t it be faster, and more efficient for the data sets to be broken into “per HostName” files? One per computer, then the clients can download just their information? Less network traffic overall (even uncompressed), and less effort for the client to locate its information, changed or not.
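A sketch of what that per-hostname split could look like, assuming Python on the export side and a pipe-delimited file whose first column is the hostname (adjust both to your actual layout):

```python
import csv
import os

def split_export(export_path, out_dir, delimiter="|"):
    """Split the single asset-export file into one small file per host.
    Assumes the first column of each row is the hostname; the rest of
    the columns (owner, support group, etc.) are written through as-is."""
    os.makedirs(out_dir, exist_ok=True)
    with open(export_path, newline="") as src:
        for row in csv.reader(src, delimiter=delimiter):
            if not row:
                continue  # skip blank lines in the export
            hostname = row[0].strip().lower()
            dst_path = os.path.join(out_dir, hostname + ".txt")
            with open(dst_path, "w", newline="") as dst:
                csv.writer(dst, delimiter=delimiter).writerow(row)
```

Each client then only ever fetches `<its-hostname>.txt`, a few hundred bytes instead of the whole 350 KB archive.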
Currently, my policy download action for the clients uses “download now as”, so it bypasses the relay structure and downloads the compressed file directly from the main BES.
Yes, it would be faster and more efficient to have a single file per endpoint; however, then I’d have a folder in my \wwwroot\ folder with thousands and thousands of files. Not sure how Windows will like that…
I was thinking of having each client download this info from a third-party asset cache, but then we would have to open another firewall rule, and our advertisement to our customers that BigFix only needs 1 port to a host to work properly wouldn’t be true. Any other suggestions? I like the brainstorming.
You are right, Windows machines get cranky when you put thousands of files in one folder. They’re better than they used to be, but it’s still not worth the pain if you can avoid it.
Where I work, we use naming conventions so that computer names start with prefixes that mean something to those who understand the conventions (e.g. abc-00001, zyx-99999, etc.).
Could you break the files into manageable clusters that way? Even if all of one type were in a single ZIP file, it should speed the update process, since a smaller data set is being pulled down and scanned through.
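Sketching the clustering idea in Python, assuming the same pipe-delimited export with the hostname in the first column and a 3-character naming-convention prefix (both assumptions; match them to your environment):

```python
import csv
import os
import zipfile
from collections import defaultdict

def cluster_export(export_path, out_dir, prefix_len=3, delimiter="|"):
    """Group export rows by the first prefix_len characters of the
    hostname (e.g. 'abc' from 'abc-00001') and write one ZIP per
    cluster, each holding a single text file of that cluster's rows.
    Returns the sorted list of cluster prefixes that were written."""
    os.makedirs(out_dir, exist_ok=True)
    clusters = defaultdict(list)
    with open(export_path, newline="") as src:
        for row in csv.reader(src, delimiter=delimiter):
            if row:
                clusters[row[0][:prefix_len].lower()].append(delimiter.join(row))
    for prefix, lines in clusters.items():
        zip_path = os.path.join(out_dir, prefix + ".zip")
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
            z.writestr(prefix + ".txt", "\n".join(lines) + "\n")
    return sorted(clusters)
```

Each client would then fetch only the ZIP matching its own prefix, so it downloads and scans a fraction of the full data set.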
To really get funky, since you’re already bypassing the Relays: write a script for your web server (php, asp, etc.) that simply presents the data you want from a database back end, based on the hostname passed as a variable. Then use a constructed URL in the “download now as” line. Even better if you can have the script directly access the original source for the data.
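A minimal sketch of that lookup script, in Python rather than php/asp. Everything here is an assumption for illustration: the `assets.db` SQLite copy of the export, the `assets` table with `hostname`/`owner`/`support_group` columns, the `/asset?host=NAME` URL shape, and port 8080.

```python
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

DB_PATH = "assets.db"  # hypothetical SQLite copy of the asset export

def lookup(con, host):
    """Return one host's record as a delimited string, or None
    if the hostname is not present in the assets table."""
    row = con.execute(
        "SELECT owner, support_group FROM assets WHERE hostname = ?",
        (host.lower(),)).fetchone()
    return "|".join(row) if row else None

class AssetHandler(BaseHTTPRequestHandler):
    """Answers GET /asset?host=NAME with only that host's record,
    so a client's constructed 'download now as' URL pulls just
    its own data instead of the full export."""
    def do_GET(self):
        host = parse_qs(urlparse(self.path).query).get("host", [""])[0]
        con = sqlite3.connect(DB_PATH)
        record = lookup(con, host)
        con.close()
        if record is None:
            self.send_error(404, "unknown host")
            return
        body = record.encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run standalone:
# HTTPServer(("", 8080), AssetHandler).serve_forever()
```

The client’s action would then construct something like `http://server:8080/asset?host=<computer name>` in its “download now as” line.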
I like your last suggestion. When you say “web server”, that would be my main BES, correct? If so, would I need to install IIS, or is there some way I can get the same results using the built-in BigFix web server application?
Actually, I was thinking of some other www server. I don’t think adding an http server to your BES server is a good idea.
While the BES agents do use the HTTP protocol to communicate with relays, they use an alternate port number (52311/TCP).
I’m not sure you want the added activity of several thousand clients querying your Main BES server to run a script that will likely need to make its own TCP connections and database queries.
You would not need a very robust server, but I think it should be a separate box.
Agreed, it would be better if that www server didn’t run on the main BES. However, then I’d have to go around and open a firewall port to that new www host for a lot of subnets, which I can’t do.