"download as" vs "download now as"

(imported topic written by cstoneba)

I’m trying to get the TEM client to download a page from a website so I can confirm that there are no ACL issues in the environment when the client connects to the URL mentioned below. The problem is that port 80 is blocked from the client to the target URL, so I have no idea how “download as” is succeeding. Why does “download as” succeed (when it shouldn’t) while “download now as” fails, as expected?

Completed download as “page.txt” http://url1234.net

Failed download now as “page.txt” http://url1234.net

(imported comment written by Zakkus)

“download as” works the same as our standard download command, except that it lets you easily rename it.

When an endpoint requests a download in BigFix, it doesn’t go out to the URL and download the file itself. Rather, it tries to download it from its parent relay. If the relay has the file cached, it simply delivers it; otherwise it asks its own parent, all the way up to the server. The server is the only component that actually goes out onto the web and downloads the file, and because of all the caching this really only happens once, regardless of how many requests are made. This traffic flows over the same port as normal BigFix traffic, so you likely already have that open.

The “download now” command is the one exception to this. Each client will actually attempt to go out to the web and download the file directly, which is why it fails in your environment.

You should avoid using the download now command. It doesn’t get the same optimizations that all other downloads in BigFix use, and it can potentially cause you problems.
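To make the contrast concrete, here is a rough sketch of the two forms as two separate one-line test actions, using the placeholder URL from your post (nothing environment-specific is assumed):

// goes through the relay chain; the client never opens its own connection to the web server
download as "page.txt" http://url1234.net
continue if {exists file "page.txt" of folder "__Download"}

// the client itself connects straight to the web server, so a blocked port 80 only matters here
download now as "page.txt" http://url1234.net
continue if {exists file "page.txt" of folder "__Download"}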

-Zak

(imported comment written by cstoneba)

Yup, understand that, Ben. Maybe some background will help. I have a fixlet that does a wget call to server123. This works for 95% of my clients. It doesn’t work on the remaining 5% because an ACL is blocking port 80 from the client to server123.

So what I’m trying to do is make a simple fixlet that uses the “download now as” command to attempt to download the home page on http://server123. I will then push the fixlet out to all my clients, and where it fails, I know that there are ACL issues blocking port 80.

What I’m having issues with is why “download now as” fails on one of the clients in the 95% group, where the wget command to server123 works. I also just tried replacing the wget call with “download now as” and it worked, but for some reason my “ACL test” fixlet continues to fail.

~ACL Test Fixlet (http 302 error in client log)

download now as “page.txt” “http://server123.net”

continue if {exists file “page.txt” of folder “__Download”}

~Wget/Download now as (works)

download now as “mm.txt” “http://server123/adhoc.php?servers[]={computer name}”

continue if {exists file whose (name of it = “mm.txt” AND line 1 of it = “Server(s) updated: 1”) of folder “__Download”}

(imported comment written by SystemAdmin)

We also use a combination of wget and BigFix for ACL tests. In fact, many custom fixlets that we write for configuring web service calls are prepended with a wget ACL test.

On the specific machine that should work, but doesn’t, can you telnet from the client to the server on the desired port? I know in our shop, sometimes what was open yesterday isn’t open today due to any number of reasons.

There are a few tips with wget. I like to use the retries option (--tries=3) to avoid a communications hiccup skewing my results. In addition to downloading the web page, use the ‘-a’ option to write out a log file of the communication attempt. Many times the log is more valuable than the web page; I like to parse it for status info such as ‘connected’, ‘timed-out’, or ‘failed to resolve’. I also noticed that attempting to parse either the web page or the log file immediately can yield a race condition, so inserting a ‘wait’ for a few seconds eliminates that possibility (e.g. a DOS ‘ping localhost >nul’). When querying for the downloaded web page, I find it helpful to make sure its size is > 0. You may be thinking some of your other downloads worked with wget, when in fact they failed with a zero-byte web page.

Example:

delete __appendfile

appendfile @echo off

appendfile c:\temp\wget\wget.exe --tries=3 http://<web_server>/<web_page.html> -a c:\temp\wget.log -O c:\temp\web_page.html

appendfile ping localhost >nul

copy __appendfile c:\temp\ACL_test.bat

waithidden c:\temp\ACL_test.bat

continue if {exists file “c:\temp\wget.log” whose (content of it contains “connected”)}

continue if {exists file “c:\temp\web_page.html” whose (size of it > 0)}

When testing a whole set of ACLs, I skip the ‘continue if’ statements, but then make an analysis of the log file results. It is helpful to make the name of each log file the name of the target host.
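If you go the analysis route, a property along these lines would do it; the path and search strings are just the ones from the example above, and the exact text wget writes can vary by build, so treat it as a starting point rather than a finished property:

if (exists file "c:\temp\wget.log")
 then (if (exists lines whose (it contains "connected") of file "c:\temp\wget.log") then "connected"
       else if (exists lines whose (it contains "timed out") of file "c:\temp\wget.log") then "timed out"
       else "failed")
 else "no log"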

(imported comment written by cstoneba)

JonLandis, so what we’re trying to do sounds very similar. I’ve always used wget to issue a command, but just recently decided to also use wget to test ACLs before I issue the wget command. However, I wanted to switch my command from wget to the built-in “download” commands that BigFix supports, so I don’t have to deploy wget to all our Windows clients.

However, I still have the problem that the wget command works but “download now as” fails. It appears that an HTTP 302 error may be the cause, but then why does wget work?

BigFix:

Command failed (Can’t download file 'General transport failure. http://server123.net/index.php’ http failure code 302’) download now as page.txt http://server123.net/index.php

WGET:

wget.exe --output-document=page.txt http://server123.net

11:22:10  http://server123.net/
          => `page.txt'
Resolving server123.net... 1.2.3.4
Connecting to server123.net|1.2.3.4|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://server123.net/index.php?Session=076025769508e3aa9cd66f94c078b0d8 [following]

11:22:10  http://server123.net/index.php?Session=076025769508e3aa9cd66f94c078b0d8
          => `page.txt'
Connecting to server123.net|1.2.3.4|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]

    [ <=> ] 2,574        --.--K/s

11:22:10 (8.76 MB/s) - `page.txt' saved [2574]
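Comparing the two logs, the difference looks like redirect handling: wget sees the 302, follows it to index.php?Session=…, and gets a 200, while “download now as” stops at the 302 and reports the transport failure. If that is what’s happening, one way to keep the ACL test simple is to point it at a location on server123 that answers with a 200 directly rather than a redirect. This is only a sketch; the path below is a placeholder, and the size check borrows JonLandis’s earlier advice:

// hypothetical ACL-test fixlet; <non_redirecting_page> is a placeholder for
// any page on server123 that returns 200 directly instead of a 302
download now as "page.txt" "http://server123/<non_redirecting_page>"
continue if {exists file whose (name of it = "page.txt" AND size of it > 0) of folder "__Download"}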

(imported comment written by SystemAdmin)

Does the web page downloaded by wget contain good data, and is it non-zero in size? Are there any proxy servers that may be interfering?

Why not continue to use wget? Either include it in your image or policy-action it out to all your machines. I have a bare-bones set of wget files deployed to thousands of machines in my environment expressly to do things like ACL checking.
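If you do push it out with a policy action, a bare-bones sketch might look like the following; the download URL and target folder are assumptions you would swap for your own staging location:

// hypothetical policy action to stage wget.exe on each client;
// <your_server> is a placeholder for wherever you host the binary
download as "wget.exe" "http://<your_server>/tools/wget.exe"
continue if {exists file whose (name of it = "wget.exe" AND size of it > 0) of folder "__Download"}
// create the target folder and replace any existing copy
waithidden {pathname of system folder}\cmd.exe /c mkdir c:\temp\wget
delete c:\temp\wget\wget.exe
copy __Download\wget.exe c:\temp\wget\wget.exe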

Side-note: I’ve found it is also useful for inventorying printers with built-in web servers (wget status pages to a consistent place per location, then make an analysis of their contents).