I’m trying to get the TEM client to download a page from a website, so I can confirm that there are no ACL issues in the environment when the client connects to the URL mentioned below. The problem is that port 80 is blocked from the client to the target URL, so I have no idea how “download as” is succeeding. Why does “download as” succeed (when it shouldn’t), while “download now as” fails, as expected?
“download as” works the same as our standard download command, except that it lets you easily rename the downloaded file.
When an endpoint requests a download in BigFix, it doesn’t go to the URL and download the file itself. Rather, it tries to download it from its parent relay. If the relay has the file cached, it just delivers it; otherwise it asks its own parent, all the way up to the server. The server is the only component that actually goes out onto the web and downloads the file, and this really only happens once, regardless of how many requests are made (because of all the caching). This traffic happens on the same port as normal BigFix traffic, so you likely already have that open.
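The request-up-the-chain, cache-on-the-way-down behavior described above can be sketched in a few lines. This is a hypothetical Python model (not BigFix code, and the names `Node`, `request`, and the example URL are illustrative only), just to show why the origin download happens once no matter how many clients ask:

```python
# Hypothetical sketch of the relay download model: each node checks its own
# cache, then asks its parent; only the root ("server") fetches from the web,
# and caching means that fetch happens only once.

class Node:
    def __init__(self, parent=None):
        self.parent = parent      # None means this node is the root server
        self.cache = {}
        self.web_fetches = 0      # how many times we hit the actual internet

    def request(self, url):
        if url in self.cache:
            return self.cache[url]
        if self.parent is None:   # root server: the only web-facing node
            self.web_fetches += 1
            data = f"contents of {url}"   # stand-in for the real download
        else:
            data = self.parent.request(url)
        self.cache[url] = data    # every hop caches on the way back down
        return data

server = Node()
relay = Node(parent=server)
clients = [Node(parent=relay) for _ in range(3)]
for c in clients:
    c.request("http://example.com/patch.exe")
print(server.web_fetches)  # 1 -- three client requests, one web download
```

Three clients request the same file, but the server reaches the web exactly once; the relay serves the second and third requests from its cache.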
The “download now” command is the one exception to this. Each client will actually attempt to go out to the web and download the file directly, which is why it fails in your environment.
You should avoid using the download now command. It doesn’t get the same optimizations that all other downloads in BigFix use, and it can potentially cause you problems.
Yup, I understand that, Ben. Maybe some background will help. I have a fixlet that does a wget call to server123. This works for 95% of my clients. It doesn’t work on the remaining 5% because an ACL is blocking port 80 from the client to server123.
So what I’m trying to do is make a simple fixlet that uses the “download now as” command to attempt to download the home page at http://server123. I will then push the fixlet out to all my clients, and wherever it fails, I know that ACL issues are blocking port 80.
What I’m having issues with is why “download now as” fails on one of the 95% clients, where the wget command to server123 works. I also just tried replacing the wget call with “download now as” and it worked, but for some reason my “ACL test” fixlet continues to fail.
We also use a combination of wget and BigFix for ACL tests. In fact, many custom fixlets that we write for configuring web service calls are prepended with a wget ACL test.
On the specific machine that should work, but doesn’t, can you telnet from the client to the server on the desired port? I know in our shop, sometimes what was open yesterday isn’t open today due to any number of reasons.
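If telnet isn’t available on the client, the same reachability check can be done with a short script. This is a hedged sketch in Python (the hostname `server123` is from the thread; the helper name `port_open` is mine): it only tests whether a TCP connection can be established, which is exactly what an ACL blocking port 80 would prevent.

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:          # refused, timed out, unresolvable, etc.
        return False

# Usage against the server in question:
# print(port_open("server123", 80))
```

A `False` here on the failing 5% of clients would confirm the ACL is the culprit, independent of any wget or BigFix download behavior.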
There are a few tips with wget. I like to use the retries option (--tries=3) to avoid a communications hiccup skewing my results. In addition to downloading the web page, use the ‘-a’ option to write out a log file of the communication attempt. Many times the log is more valuable than the web page; I like to parse it for status info such as ‘connected’, ‘timed out’, or ‘failed to resolve’. I also noticed that attempting to parse either the web page or the log file immediately can yield a race condition, so I found that inserting a ‘wait’ of a few seconds eliminates that possibility (e.g. a DOS ‘ping localhost >nul’). When querying for the downloaded web page, I find it helpful to make sure its size is > 0. You may be thinking some of your other downloads worked with wget, when in fact they failed with a zero-byte web page.
Example:
delete __appendfile
appendfile @echo off
appendfile c:\temp\wget\wget.exe --tries=3 http://<web_server>/<web_page.html> -a c:\temp\wget.log -O c:\temp\web_page.html
appendfile ping localhost >nul
copy __appendfile c:\temp\ACL_test.bat
waithidden c:\temp\ACL_test.bat
continue if {exists file "c:\temp\wget.log" whose (content of it contains "connected")}
continue if {exists file "c:\temp\web_page.html" whose (size of it > 0)}
When testing a whole set of ACLs, I skip the ‘continue if’ statements and instead make an analysis of the log file results. It is helpful to name each log file after the target host.
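The log-parsing step above can be sketched as a small classifier. This is a hypothetical Python example (the status keywords are assumptions based on typical wget output and should be adjusted to match your wget version’s wording):

```python
# Hypothetical sketch of the log analysis: classify each host's wget log by
# the first status keyword found. Keywords are assumptions; tune to your
# wget's actual log wording.

STATUSES = ("connected", "timed out", "failed to resolve")

def classify(log_text):
    """Return the first matching status keyword, or 'unknown'."""
    text = log_text.lower()
    for status in STATUSES:
        if status in text:
            return status
    return "unknown"

# One log per target host, named after the host as suggested above:
results = {
    "server123": classify("Connecting to server123:80... connected."),
    "server456": classify("failed: Connection timed out."),
}
print(results)  # {'server123': 'connected', 'server456': 'timed out'}
```

Run over a directory of per-host logs, this gives a quick pass/fail map of which ACLs are open.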
JonLandis, so what we’re trying to do sounds very similar. I’ve always used wget to issue a command, but just recently decided to also use wget to test ACLs before I issue the wget command. However, I wanted to switch my command from wget to the built-in “download” commands that BigFix supports, so I don’t have to deploy wget to all our Windows clients.
However, I still have the problem that the wget command works while “download now as” fails. It appears that an HTTP 302 redirect may be the cause, but then why does wget work?
BigFix:
Command failed (Can’t download file 'General transport failure.
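One plausible explanation for the difference: wget follows HTTP 302 redirects by default, while a downloader that performs a single request just sees the 302 itself and reports a failure. This local Python sketch (the redirect setup is hypothetical, not server123’s actual configuration) shows the two behaviors side by side, using `http.client` for a one-shot request and `urllib.request` for a redirect-following one:

```python
# Demonstrate why a redirect-following client "works" where a single-request
# client sees only the 302. Local test server; paths /old and /new are made up.

import http.client
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            self.send_response(302)                # redirect to the real page
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Length", "9")
            self.end_headers()
            self.wfile.write(b"home page")

    def log_message(self, *args):                  # silence request logging
        pass

srv = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=srv.serve_forever, daemon=True).start()
port = srv.server_address[1]

# Single-request client: stops at the redirect, analogous to a downloader
# that treats anything other than a 200 as a transport failure.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/old")
single_status = conn.getresponse().status

# Redirect-following client: lands on the real page, as wget does by default.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/old") as resp:
    followed_status, body = resp.status, resp.read()

srv.shutdown()
print(single_status, followed_status, body)  # 302 200 b'home page'
```

If server123 is answering with a 302, that would square with wget succeeding while the built-in download reports a transport failure; a proxy injecting redirects could produce the same symptom.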
Does the web page downloaded by wget contain good data, and is it non-zero? Are there any proxy servers that may be interfering?
Why not continue to use wget? Either include it in your image or policy-action it out to all your machines. I have a bare-bones set of wget files deployed to thousands of machines in my environment expressly to do things like ACL checking.
Side note: I’ve found it is also useful for inventorying printers with built-in web servers (wget their status pages to a consistent place per location, then make an analysis of their contents).