Question on Archive Now

We’re using the Archive Now feature to collect logs from our clients for troubleshooting purposes. This works fine, but we’ve noticed that files larger than roughly 19-20 MB are not getting processed and are not copied to the individual folder. Is there a setting somewhere that needs to be changed for these files to be processed correctly?

These are the files I see when I check the folder on the Root Server: \BigFix Enterprise\BES Server\UploadManagerData\BufferDir\Temp

_152b4c718845f71bc99ae16954ec9e6872c6047e 5/2/2022 25,499KB
_81462e58bb7092a3b1c52ff6ecfbab9209a6cd29 5/4/2022 39,024KB
The last one is from a test workstation, and it has been forwarded from Client - local relay - regional relay - Root Server.

Client Logs
At 15:42:49 +0200 -
[ThreadTime:15:42:49] Starting upload of file ‘8b55128128ed805d5ef196d9221c2947879c481832bed4ab3ff8155fb224b740’ of size 39960059
At 15:43:41 +0200 -
[ThreadTime:15:43:40] Successfully uploaded file ‘C:\Program Files (x86)\BigFix Enterprise\BES Client__BESData__Global\Upload\sha256\8b55128128ed805d5ef196d9221c2947879c481832bed4ab3ff8155fb224b740’

Any ideas or suggestions are welcome.

Do you set _BESClient_ArchiveManager_MaxArchiveSize at all?

It might be in your script or it could already be set on the clients.
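If it isn’t set yet, one way to check and raise it is a one-off action against the clients. A minimal sketch only; the value of 104857600 (roughly 100 MB, assuming the setting is in bytes) is just an example, adjust to what you need:

// Raise the client-side archive size limit for Archive Now uploads
setting "_BESClient_ArchiveManager_MaxArchiveSize"="104857600" on "{parameter "action issue date" of action}" for client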

That was the initial issue, and back then the file did not even get uploaded to the root server. I’ve changed that setting and I can now see the file being uploaded, but it stays in that Temp directory. I can also see smaller files in that directory for a few seconds, but they disappear right away and are processed correctly.

I found a setting that might explain this behavior while reading the documentation:
_BESRelay_UploadManager_CompressedFileMaxSize (Default: 20MB)

I will increase this value and try again.
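In case it helps anyone else, my plan is to push it the same way as any other computer setting, targeted at the Root Server (and relays, if needed). A sketch only; 52428800 (50 MB) is just an example value, assuming the setting is in bytes:

// Raise the Upload Manager compressed-file limit (default 20 MB) on the server/relay
setting "_BESRelay_UploadManager_CompressedFileMaxSize"="52428800" on "{parameter "action issue date" of action}" for client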

I can confirm the issue is resolved by increasing the value of the setting
_BESRelay_UploadManager_CompressedFileMaxSize

Watch out:
This setting only applies to new files uploaded after the change; existing files (such as the ones already sitting in the Temp directory) will not be processed.