Non-predetermined downloads

Hi
I have a customer that wants to download a number of files using a Fixlet that is scheduled to run later that night. I am happy with the manifest process to provide a list of files to download but in this case the files to be downloaded could change between submission and installation.
Has anyone come up with a solution that can download whatever files are in a specified location at execution time? The best I have come up with is a regularly scheduled task to update the manifest file periodically.
Thanks
David

As a PS: when creating or updating manifests, I use a REST API call to add the manifest file to a custom site rather than using the GUI - is there an easier way?

You could avoid the “BigFix-native” download method entirely by using “wait” or “waithidden” to download the files with an external call in the Action (using wget, curl, or plain old “copy” or “cp” if you have the content hosted locally). We have a number of Fixlets internally that copy files from NFS servers, for example.
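As a rough sketch of that approach (the URL, folder, and file names here are made up, and it assumes curl.exe is available on the endpoint), an Action could look something like:

```
// Sketch only: URL, folder, and file names are hypothetical.
// Assumes curl.exe is on the endpoint (it ships with recent Windows builds).
folder create "c:\temp\downloads"
waithidden curl.exe -o "c:\temp\downloads\payload.zip" http://fileserver.example.com/payloads/payload.zip
// Bail out of the action if the download didn't land.
continue if {exists file "c:\temp\downloads\payload.zip"}
```

Note that this bypasses relay caching, bandwidth throttling, and checksum verification, so it trades the BES download plumbing for flexibility.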


I presume by Manifest you mean Dynamic Downloads

See: http://www-01.ibm.com/support/knowledgecenter/#!/SS63NW_9.1.0/com.ibm.tivoli.tem.doc_9.1/Platform/Action/c_introducing_dynamic_downloads.html

This could include downloading the manifest as the task runs

Thanks Jason. I guess this will work if we know what the filenames are, or if I can set up an FTP server and do an mget. The environment is all Windows, so in theory we could use a net use \\server\share link, but if I recall correctly this doesn’t work from the agent as it’s running as SYSTEM.
Any pointers to example Fixlets welcome :sunglasses:

Thanks Alan
Yes, I mean Dynamic Downloads. I can get these to work OK but in this particular case, the list of files to be downloaded needs to be generated at execution time, not submission time.

I’m curious to know a little more about what these files are and what is being accomplished, though I can understand if you don’t want to disclose that in detail.

You could have a script that automatically generates a task with proper prefetches based upon the contents of the folder and the URL that they can be downloaded from.

I didn’t know this until recently, but apparently you can also have the client run an EXE for up to 60 seconds that would dynamically figure out what the prefetches should be or something. I have never done this and find it a bit odd, but it is in the documentation: http://www-01.ibm.com/support/knowledgecenter/SS63NW_9.2.0/com.ibm.tivoli.tem.doc_9.2/Platform/Action/c_execute_prefetch_plug_in.html
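For reference, the shape of that feature (per the linked documentation) is roughly the following; the plug-in name, its argument, and the output file name below are all hypothetical. The idea is that the plug-in runs during the prefetch phase and writes prefetch commands to an output file, whose lines are then fed back in as prefetch items:

```
// Sketch only: plug-in name, argument, and file names are made up.
begin prefetch block
execute prefetch plug-in "{pathname of client folder of current site}\MyPlugin.exe" /outfile "{download path "plugin_output"}"
// Feed the plug-in's generated prefetch lines back into the block.
add prefetch item {concatenation " ; " of lines of download file "plugin_output"}
end prefetch block
```

As noted below, the agent is blocked while the plug-in runs, so this is best reserved for cases that relevance alone can't handle.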

Yes, these plug-ins are used by many of the Red Hat-type patches; they determine prerequisites and download them in addition to the RPM that you are trying to install. They aren’t always the best thing to use, as they block the agent entirely while they run, but we have to live with them for this type of thing.


I agree, I would recommend avoiding the use of a prefetch plugin if it is not needed.

The documentation mentions a 60-second time limit - is that true?

So if you are saying that you submit an action, but the needed files are only known when the action runs, and this is a global thing, you can do something like the following (excuse any shorthand in the commands):

         begin prefetch block
         add prefetch item name=manifest.txt sha1=X size=X url=http://site.com:52311/manifest.txt sha256=X
         collect prefetch items
         add prefetch item {concatenation " ; " of lines of download file "manifest.txt"}
         end prefetch block

If you are saying that the endpoint needs to determine something else, then it’s a lot more complicated and a plug-in can solve the issue, but you can run a lot of relevance in a prefetch block to help you determine what is needed.
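As an example of using relevance inside a prefetch block to decide what gets downloaded (file names and URLs are placeholders, following the X shorthand above):

```
// Sketch only: names, hashes, and URLs are placeholders.
begin prefetch block
// Relevance decides at execution time which payload to fetch.
if {x64 of operating system}
add prefetch item name=tool64.exe sha1=X size=X url=http://site.com:52311/tool64.exe
else
add prefetch item name=tool32.exe sha1=X size=X url=http://site.com:52311/tool32.exe
endif
end prefetch block
```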

The BigFix-proper way is to generate a manifest file and use the contents of that file to determine your dynamic downloads. I assume from your previous posts that you are familiar with that, including getting the manifest updated and then delivering the manifest to clients (either by updating a Fixlet to deliver the manifest to specific clients, or by attaching it as a site file).

Using a Manifest, you can take advantage of the relay caching, download throttling, etc.

If you really want to go outside the BES download architecture, you could either set up IIS to share your files over HTTP (anonymously) and have the client use wget, curl, WebDAV, or something similar to download them from your site; or you could host them on SMB file shares and grant the computer accounts access to the files.
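A minimal sketch of the SMB variant (server, share, and file names are hypothetical, and it assumes the computer account has been granted access as described in the next paragraph):

```
// Sketch only: server, share, and paths are hypothetical.
// The agent runs as LocalSystem, so the *computer* account needs share access.
waithidden cmd.exe /c copy /y "\\fileserver.example.com\payloads\payload.zip" "c:\temp\payload.zip"
continue if {exists file "c:\temp\payload.zip"}
```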

If you want to use SMB, on your file server you’d need to grant the “Domain Computers” and “Domain Controllers” groups (or a custom computer group of your choice) permissions to the files through both NTFS permissions and file-share permissions, as well as the “Access this computer from the network” right in the system’s Security Policy Editor or Group Policy. These computers may not be members of the “Users”, “Authenticated Users”, or even the “Everyone” groups, so I usually like to assign these permissions to computer groups directly. (And of course this only works in a Domain environment. If you aren’t using a Domain, then you can look up NullSessionShares. That is so abhorrent that I will discuss it no further, except to say that if you go that route, bring plenty of crosses and Holy Water, and May God Have Mercy On Your Soul.)

I used this type of configuration to grant computer (LocalSystem) accounts access to a file share during RIS, WDS, and MDT OS installations before we started using BES and TPMfOSD. This is very much not the BigFix way of doing things, but …every tool has its place.

Thanks again to everyone for their comments, I will investigate the exact requirements and will report back to close the loop.
If the requirement truly is dynamic then I may have to go with an out-of-band solution and download outside the relay architecture; if I can pre-determine the file list then I’ll use a manifest.
Cheers
David

As an exercise I installed IIS on my IEM server, created an FTP site, allowed anonymous access, and created a simple Task to test the principle - it seems to work fine if you accept that you need direct FTP access from clients to the server and that there is no relay caching, checksums, etc.

action parameter query "SourceFolder" with description "Please enter the source directory (under C:\inetpub\ftproot on the IEM server):"
action parameter query "TargetFolder" with description "Please enter the full target directory:"

if {not exists folder (parameter "TargetFolder" of action)}
dos mkdir "{parameter "TargetFolder" of action}"
endif

createfile until THEEND
bin
lcd {parameter "TargetFolder" of action}
cd {parameter "SourceFolder" of action}
ls
mget *
bye
THEEND
delete ftp.script
copy __createfile ftp.script

createfile until THEEND
ftp -A -i -s:ftp.script iemhostname >> c:\temp\ftpget.log
THEEND
delete ftpget.bat
copy __createfile ftpget.bat
wait ftpget.bat

If you are going to do things out of band, then there is no reason to do this on the IEM root server itself; I would recommend using a different server for the out-of-band role in production.