It seems to get hung up when retrieving the file, though. We see the following download error:
Download error: "URLInfo: Attempt to use missing Scheme."
Download requested on server:
URL: {parameter
Hash: (sha1)7013596a9ecd07cbb451854b19094d14d667d569
Size:
Next retry: The download will be retried the next time it is requested. Retry now
It works in our test environment, which is on 9.1, but we get the error when running it in prod, which is on 9.2. Any idea what may cause a message like this?
I haven’t seen that specifically before. You might have better luck using a Prefetch Block (to ensure the parameter is actually evaluated during prefetch processing). You might also get different results depending on whether your action is set to “begin downloads before constraints are met”.
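For reference, a minimal sketch of what a Prefetch Block with a parameter might look like (the parameter name, file name, hash, size, and URL here are all hypothetical placeholders, not taken from your action):

```
begin prefetch block
    // the parameter is evaluated during prefetch processing,
    // so it must resolve to a full URL including the scheme
    parameter "baseURL" = "https://downloads.example.com"
    add prefetch item name=installer.exe sha1=0123456789abcdef0123456789abcdef01234567 size=1000000 url={parameter "baseURL"}/installer.exe
end prefetch block
```

If the substitution produces a URL without a leading `http://` or `https://`, you would expect exactly a “missing Scheme” style failure, so it is worth checking what the parameter actually evaluates to.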
If it works in 9.1 but not 9.2, you should probably do a PMR.
It seems like you are doing this because you have multiple root servers that you are testing actions against.
Like @JasonWalker suggests, you can just use localhost if the file will always be hosted on the root server that you are going to run the action on.
But I think a better option is to host all of your files in a repository that you expose over HTTP/HTTPS to all of your root servers. Then the URL will always be https://mysoftwarerepo.organization.tld/whatever
This means the source of your internal files will not be on the root server at all, which significantly reduces the storage you need on your root servers. I would still recommend that your root servers have a relatively large web cache, but it won’t be your source of truth, and you won’t have to worry about backing up or syncing the cache and any conflicts that might cause.
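With a central repository like that, the action can use a plain prefetch statement pointing at the repo. A sketch (the file name, hash, size, and path are hypothetical; note the URL includes an explicit https:// scheme, which is the kind of thing the “missing Scheme” error complains about when it is absent):

```
prefetch installer.exe sha1:0123456789abcdef0123456789abcdef01234567 size:1000000 https://mysoftwarerepo.organization.tld/whatever/installer.exe
```

The same URL then works unchanged no matter which root server the action is run from.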
hey @jgstew I like the idea of hosting a repo on another server, but IBM said that was frowned upon… have you ever run into any issues doing it that way?
Hosting your internal files on your own repo actually behaves identically to how vendor downloads are handled. IBM doesn’t keep a copy of every Microsoft patch; the fixlet instructs the server to download these from http://microsoft.com/whatever. As long as you make your internal repository available to the BES server via http/https, it works quite well.
hey @JasonWalker, ya totally agree. Many moons ago we were told to host all downloads on the BigFix server, not on any other servers, but if that is no longer the case we will definitely keep our options open. Thank you again for all your help on this.