We’re running in an air-gapped environment, but I can’t seem to find a download location for the BESDownloadCacher. I have what I presume is quite an old version of the tool (5.8.1.0) and found a download location for a 2020 release (5.8.6.0), but I’d like to know where to find anything newer - I’ve checked the usual download location and nothing obvious is there.
Secondary to this, I want to use the tool to cache only the files I need rather than everything in the default sites. I understand that I can use -z in the command and create a custom site containing only the patches I need to download, but how do I get the tool to identify the correct site, and how does it know what that site is?
The associated documentation covers quite a lot of options, including those that let you control which sites and files get downloaded: Non-extraction usage
I’d be interested to understand at which point you’re seeing that error in the non-extraction air-gap process, but either way I’d suggest opening a support case to work through it, as it is unexpected.
Thanks again for the reply.
I’ll open a ticket then to understand the issue. Basically, after running for a period of time it seems to reach a stage where it has done a fair chunk of the work and then just stops.
Just as a follow-up, I have raised a ticket with support and got past the initial issue with the non-extraction method (the execution-terminated error).
It still doesn’t work when using file lists to download only the files we need (which is frustrating - it lists 14k files and downloads only 2k); support are still working on this.
Cheers for the reply. As per my first post, I’ve tried removing everything over 2 GB (so well under the limitation) and that stabilised it; the issue now is that it doesn’t want to download the right files - it stops after ~2.1k of the 14.4k files. Support are diligently working through the issue, but it strikes me as odd - surely others are already doing this without hitting these problems?
It’s strange, as our situation is far from unique - pretty much anyone who is air-gapped is (or should be) doing something similar, though whether they filter the file list down to just a small subset I don’t know.
Thanks for the reply. Yes, I’d got to the bottom of the 4 GB file issue; the issue that isn’t resolving is that the downloads stop after 2k files and ignore the other 12k files in the list.
Still working with support to resolve this and they’ve not seen this before.
What I have done (as a temporary workaround) is create a PowerShell script that pulls the URLs from the produced file list and downloads them for me; however, it would be good to get the proper method working. A rough sketch of the approach is below.
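For anyone in a similar position, here’s a minimal sketch of that workaround. It assumes the non-extraction file list is plain text with one download URL per line; the paths ($fileList, $outputDir) are placeholders for your own environment, and you may need to adjust the filtering and file naming to match what your file list actually contains.

```powershell
# Minimal sketch: download every URL listed in the non-extraction file list.
# Assumptions: one URL per line; local file names taken from the URL path.
$fileList  = 'C:\Airgap\filelist.txt'    # path to the produced file list (placeholder)
$outputDir = 'C:\Airgap\Downloads'       # destination for the downloaded files (placeholder)

if (-not (Test-Path $outputDir)) {
    New-Item -ItemType Directory -Path $outputDir | Out-Null
}

Get-Content $fileList | Where-Object { $_ -match '^https?://' } | ForEach-Object {
    $url  = $_.Trim()
    # Name the local file after the last segment of the URL path
    $name = [System.IO.Path]::GetFileName(([System.Uri]$url).AbsolutePath)
    $dest = Join-Path $outputDir $name

    if (-not (Test-Path $dest)) {
        try {
            Invoke-WebRequest -Uri $url -OutFile $dest -UseBasicParsing
        }
        catch {
            Write-Warning "Failed to download ${url}: $_"
        }
    }
}
```

It skips files that already exist, so it can be re-run safely, but it makes no attempt to reproduce whatever naming or layout the cacher itself expects - treat it purely as a stop-gap.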