Web Reports Out Of Memory

(imported topic written by cj6591)

We are trying to run a report via SOAP to gather all of the installed apps on each client machine.

This used to work, but then we started getting an out-of-memory error (we’re not sure what changed; the client count is still about the same).

We can get it to run if we segment the clients and pull them in smaller batches, but this is extremely inefficient.

Has anyone else run into this problem?

Can anyone suggest what I should look at?

Web Reports is running on the same machine as the root server.

There’s plenty of memory in the machine.

Web Reports is not hitting the cap on the memory it can allocate.

To give you a little background on the process …

We are doing the call from a Perl program.

We capture the results, parse them, and produce reports on the Unix box where the Perl app runs.
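
Roughly, the call looks like this. The hostname, credentials, and relevance are placeholders, and the response element names are from memory, so check them against your WSDL; this is a trimmed-down sketch, not our production code:

    use strict;
    use warnings;
    use SOAP::Lite;

    # Placeholders -- substitute your own Web Reports host and credentials.
    my $url  = 'http://webreports.example.com/webreports';
    my ($user, $pass) = ('soapuser', 'secret');

    # Session relevance pulling computer names plus an installed-apps property
    # (the property name is illustrative -- use whatever yours is called).
    my $relevance = '(name of computer of it, values of it) of results of '
                  . 'bes property "Installed Applications"';

    my $soap   = SOAP::Lite->proxy($url);
    my $result = $soap->GetRelevanceResult(
        SOAP::Data->name('relevanceExpr')->value($relevance),
        SOAP::Data->name('username')->value($user),
        SOAP::Data->name('password')->value($pass),
    );
    die $result->faultstring if $result->fault;

    # Each answer comes back as one string; dump them for later parsing.
    print "$_\n" for $result->valueof('//GetRelevanceResultResponse//a');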

Any help is appreciated.

(imported comment written by SystemAdmin)

I’ve run into the same problem when running large SOAP queries, and normally breaking the query up into smaller pieces works fairly well (and doesn’t add much overhead compared to a single request).
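
One way to slice it up (a sketch only, and the relevance is from memory) is by computer ID with a mod, so you never have to maintain explicit machine lists; the host, credentials, and property name below are placeholders:

    use strict;
    use warnings;
    use SOAP::Lite;

    my $url  = 'http://webreports.example.com/webreports';   # placeholder
    my ($user, $pass) = ('soapuser', 'secret');               # placeholders
    my $batches = 50;

    my $soap = SOAP::Lite->proxy($url);

    for my $n (0 .. $batches - 1) {
        # "id of it mod N = n" carves the deployment into N roughly
        # equal slices without naming any machines up front.
        my $relevance = '(name of computer of it, values of it) of results '
                      . "(bes computers whose (id of it mod $batches = $n), "
                      . 'bes property "Installed Applications")';

        my $result = $soap->GetRelevanceResult(
            SOAP::Data->name('relevanceExpr')->value($relevance),
            SOAP::Data->name('username')->value($user),
            SOAP::Data->name('password')->value($pass),
        );
        die $result->faultstring if $result->fault;
        print "$_\n" for $result->valueof('//GetRelevanceResultResponse//a');
    }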

But if you are having trouble, an alternative method you might want to consider (and potentially a process you can use any time you are collecting large amounts of data from each machine) is the Upload Manager. Since you are already using Perl to collect and parse the data, you should be fairly comfortable just pulling your data files from a different location and going from there.

The Upload Manager archiving process takes a file you create on the client and uploads it to a specific folder on the server. As long as you can get access to some point of this file path*, you just need to walk the directory tree and pick up the files that match the names you created on the client (the names won’t be exactly the same, but you can easily figure out which ones they are).
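
On the deployments I’ve seen, the uploads land under the server’s UploadManagerData\BufferDir tree in per-computer subfolders, though treat the exact path below as a placeholder for your own install. Once you can see it, collection is just a directory walk, roughly:

    use strict;
    use warnings;
    use File::Find;

    # Placeholder root -- adjust to wherever your server puts uploads,
    # e.g. something under ...\BES Server\UploadManagerData\BufferDir.
    my $root    = '/mnt/besserver/UploadManagerData/BufferDir/sha1';
    my $pattern = qr/installed_apps/;   # whatever you named the file on the client

    my @hits;
    find(sub {
        # The server decorates the client-side name, so match loosely.
        push @hits, $File::Find::name if -f $_ && /$pattern/;
    }, $root);

    print "$_\n" for @hits;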

If you get rid of the analysis (or, more worryingly, the retrieved property) that you are using to collect the data and just put that relevance in the file-creation task, you will also reduce the memory load on your console.

-Jim

*(No matter where I am working, it always seems much harder to get access to the place the data lands than to write the script that imports, parses, and spits it back out.)

(imported comment written by cj6591)

Thanks Jim!

I am already doing that for some things, and I actually have a task that gathers archive results and merges them into one results file based on the action ID.
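
(For anyone curious, that merge step is little more than walking the upload directory for files carrying the action ID and concatenating them. The sketch below is illustrative, not the actual task, and the path is a placeholder:

    use strict;
    use warnings;
    use File::Find;

    my $root      = '/data/bigfix/uploads';   # placeholder upload location
    my $action_id = shift @ARGV or die "usage: $0 <action id>\n";

    # Concatenate every uploaded file whose name carries the action ID.
    open my $out, '>', "merged_$action_id.txt" or die $!;
    find(sub {
        return unless -f $_ && /\Q$action_id\E/;
        open my $in, '<', $_ or die "$File::Find::name: $!";
        while (my $line = <$in>) { print {$out} $line }
        close $in;
    }, $root);
    close $out;

)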

What we are currently trying to do is query the BigFix data, pull machine and application information, and dump it to a file to parse. In our organization this is a HUGE amount of data. Right now the SOAP call is the only way we have of getting the data out of the system.

We have already broken the process down into about 50 individual requests but are running into out-of-memory issues AGAIN.

We need another solution that still uses SOAP to pull the data we need from the report server.