By default, the root server stores uploaded payloads to .\BigFix Enterprise\BES Server\wwwrootbes\Uploads
We have multiple IEM environments, so when we add content to this directory on RootServer1, we have to copy it to the other root servers. Would it be possible to have the root servers change the default storage location for that folder to something outside the root server's own OS?
We would then modify the download URL in our fixlets to point to this HTTP-accessible NAS, and the same URL could be used across all of our root/DR IEM servers.
There is no reason to change the download folder to accomplish your goals, and I wouldn't recommend doing so.
Just upload your custom content to a webserver that all of the root servers can access, then use that URL in the prefetch command. All the root servers will download that content from that webserver into their webcache and keep it there until the webcache is full, at which point the oldest / least-used items are deleted to make room for new ones. You should set the root server's cache large enough to prevent it from deleting things too quickly.
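To use a shared webserver URL in a fixlet, the prefetch line needs the payload's name, SHA-1, size, and (for newer clients) SHA-256. A minimal sketch for generating that line, assuming Linux coreutils (sha1sum, sha256sum, GNU stat); the URL and file name in the example are placeholders, not anything from this thread:

```shell
# Build a BigFix-style prefetch line for a payload hosted on a shared webserver.
# Assumes GNU/Linux tools; on the Windows root server you'd compute the hashes
# another way, but the resulting prefetch line is the same.
make_prefetch() {
  file="$1"   # local copy of the payload
  url="$2"    # URL the root servers will download it from
  name=$(basename "$file")
  sha1=$(sha1sum "$file" | awk '{print $1}')
  sha256=$(sha256sum "$file" | awk '{print $1}')
  size=$(stat -c %s "$file")
  printf 'prefetch %s sha1:%s size:%s %s sha256:%s\n' \
    "$name" "$sha1" "$size" "$url" "$sha256"
}

# Example (hypothetical payload and webserver):
# make_prefetch ./MyInstaller.exe http://nas.example.com/payloads/MyInstaller.exe
```

Because every root server resolves the same URL, the same fixlet works unchanged in every environment.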
This is how I generally handle things, even though I'm not doing it for the same reasons you are. The main advantage is that this way the root server manages its cache automatically. If I manually upload things to the root server, they sit there forever until I prune them manually.
Also, if your root servers can reach each other, then you should only have to upload the file manually to one of them, use that URL in the prefetch, and the other root servers will get it automatically from the one you uploaded it to. The problem you are trying to solve would be a non-issue if all of the root servers could reach each other for the download.
The only reason I would recommend changing the download folder is to put it on a different volume on the same root server, so the downloads can sit on slower storage while FillDB and other components stay on faster storage.
If for some reason you really need to keep things in sync between multiple root servers, then you should run robocopy or something similar to copy the data from one root server's download folder to another. I would most definitely not put the actual download folder on remote network storage; that adds complexity and can potentially cause issues.
My only complaint is that when I run the Software Distribution wizard, it automatically uploads the payload to the Upload directory on the root server and inserts that URL into the prefetch command of the newly created fixlet. It would be nice if it could put the payload in a destination of my choosing and insert a custom URL into the prefetch command; otherwise I need to do those things manually, which is error-prone.
I would need to create some rsync job to keep the Upload directory on all our root servers in sync with the new HTTP-accessible Upload directory, and manually update my new fixlets with the new URL.
It should use a URL that is the absolute location of the download on the root server you ran the wizard on. That URL should work from any root server that can reach the root server you ran it on.
If you automatically generate content, you can do all of this as a part of an automated workflow.
I prefer to copy existing relevance and actionscript and customize it the first time I do something, then automate it as much as possible after that. I find the relevance automatically generated by most of the IBM tools to be overly complicated and poorly constructed, which is why I generally avoid these wizards.
You can create a template and use the Fixlet Maker Dashboard. It definitely has some quirks, but you can pre-define the relevance for a particular template while having some parts be dynamic.
Thanks for the input. I decided to set up an rsync process to keep all the Upload directories in sync, which solves that problem. I also tested using localhost in the prefetch URL and it works, so now I can just use http://localhost:52311/Uploads/... in the prefetch command across all my IEM environments and no longer need a custom URL in each fixlet.
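A sketch of what that localhost-based prefetch line looks like, assuming the Uploads directories are already synced so every root server has its own local copy. The directory layout and file name below are placeholders (the actual Uploads path on Windows is under wwwrootbes, per the first post):

```shell
# Emit a prefetch line pointing at the root server's own copy of a payload
# via localhost:52311, so the identical fixlet works in every environment.
# Assumes GNU/Linux tools (sha1sum, sha256sum, GNU stat); paths are placeholders.
localhost_prefetch() {
  uploads_dir="$1"   # local Uploads directory (kept in sync across root servers)
  name="$2"          # payload file name inside that directory
  file="$uploads_dir/$name"
  sha1=$(sha1sum "$file" | awk '{print $1}')
  sha256=$(sha256sum "$file" | awk '{print $1}')
  size=$(stat -c %s "$file")
  printf 'prefetch %s sha1:%s size:%s http://localhost:52311/Uploads/%s sha256:%s\n' \
    "$name" "$sha1" "$size" "$name" "$sha256"
}
```

Note that this only works because the download is performed by the root server itself; localhost would mean something different anywhere else.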
Yes, that method works as long as the upload folders are synced. Sorry, I didn't realize what you meant by custom URL; otherwise I would have mentioned that.
The secret is that only the root server needs to know how to get the file, so if the file is on the root server itself, localhost works just fine.