Getting <undefined> on 'hours since last report' analysis

I use this to tell me the number of hours since a client reported in. If it's greater than 1, I raise a flag that the device might be offline, since our IEM clients normally report in every 15 to 20 minutes. Out of 30 nearly identical PCs, a few will always report <undefined>. If the PCs are so similar, I don't understand how this can happen. All 30 devices are reporting in properly and at about the same time. Thanks for any assistance.

((now - last report time of client) / hour) as string & " hour"

I assume this is in an analysis property, not a web report.

This does not do what you think it does. To measure how long it has been since a client's last report, the calculation must be done in session relevance (Web Reports), which is evaluated on the server side rather than on the client.
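As a sketch of the server-side approach (assuming the standard `bes computers` session relevance inspectors), a query like this, run from Web Reports, returns each computer's name alongside the hours since its last report as seen by the server:

```
(name of it, ((now - last report time of it) / hour) as string & " hour") of bes computers
```

Here `now` is evaluated on the server, and `last report time of it` comes from the server's record of each computer, so the result keeps growing while a machine is offline.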

Regular relevance in an analysis property runs in the client context. The client evaluates (now - last report time of client), sends the result, and if it then goes offline, it doesn't send an updated result until it comes back. The reported value is therefore never refreshed while the client is offline. What (now - last report time of client) actually measures, in the client context, is the time that has passed between the last report that was successfully sent and the moment the relevance is evaluated. That result should always be less than an hour, for all clients, at all times; in most cases it will be less than 5 minutes.

This holds regardless of how long the client is offline or powered down. When the client comes back, it may momentarily evaluate this relevance and produce a large time interval, but only after it is already back online. Even that is unlikely, because a client that comes back online normally sends a report very quickly, usually before this relevance is re-evaluated, so the calculated interval will almost always be very short.
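For the original goal of flagging machines that have not reported in over an hour, a session relevance sketch (again assuming the standard `bes computers` inspectors) could filter directly:

```
names of bes computers whose (now - last report time of it > 1 * hour)
```

Run from Web Reports, this lists only the computers whose last server-side report is more than an hour old, which is the offline check that a client-evaluated analysis property cannot provide.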