Create an alert when a log entry has not been seen in X minutes/hours

I’m trying to create an alert that notifies me if my recurring background tasks have not run in an expected period of time.

All of the tasks publish logs, so how would I create an alert condition for a log entry that hasn’t been seen in X minutes/hours?

Hi @jared3 - A loss of signal alert should fit your request.

Thanks @stefan_garnham , although I’m still having trouble envisioning how I should set this up. Right now in my alert policies I have the NRQL query set to something like this:

SELECT count(*) FROM Log WHERE ...

Which returns logs that indicate that my recurring tasks ran.
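For illustration, a concrete (hypothetical) version of that query might look like the following — the `message LIKE` filter is just an assumed example, not the actual query from my setup:

```sql
-- Hypothetical: count log lines that mark a recurring task's completion
SELECT count(*) FROM Log WHERE message LIKE '%nightly-report finished%'
```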

In the thresholds section, I have this:

[screenshot of the threshold settings]

Expanding the “lost signal” section doesn’t give me many clues on how I would proceed:

[screenshot of the “lost signal” settings]

Just to be clear, I want the alert to fire if certain log statements returned by the NRQL query above aren’t seen for X minutes/hours.

Thanks!

Just wanted to bump this to see if I can get some clarity on how to configure this alert. Thanks

Hello! See more here on setting your Loss of Signal threshold:
https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/alert-conditions/create-nrql-alert-conditions/#signal-loss

This is especially important for a NRQL condition that uses count() and needs a 0 to open or close an incident. Because of the order of operations in streaming alerts, aggregators like count() and uniqueCount() never return a zero value: when the count is 0, the SELECT clause is ignored and no data is returned, resulting in a value of NULL. In your case, the evaluation needs to hit 0 to open an incident, so configure the condition to open a new "lost signal" violation, and set the loss-of-signal duration equal to the condition's threshold duration.
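If it helps, the same setup can be expressed as code. Here is a rough sketch using the `newrelic_nrql_alert_condition` resource from the New Relic Terraform provider; the account ID, policy reference, query, and durations are all placeholder assumptions you would replace with your own values:

```hcl
# Sketch: open an incident when task-completion logs stop arriving.
# All IDs, the query, and the durations below are hypothetical.
resource "newrelic_nrql_alert_condition" "task_heartbeat" {
  account_id = 1234567
  policy_id  = newrelic_alert_policy.example.id
  name       = "Recurring task stopped logging"
  type       = "static"

  nrql {
    query = "SELECT count(*) FROM Log WHERE message LIKE '%nightly-report finished%'"
  }

  critical {
    operator              = "below"
    threshold             = 1
    threshold_duration    = 3600 # matches the loss-of-signal duration below
    threshold_occurrences = "all"
  }

  # Loss-of-signal settings: treat 1 hour of silence as a lost signal
  # and open a violation when that happens.
  expiration_duration            = 3600
  open_violation_on_expiration   = true
  close_violations_on_expiration = false
}
```

The key points are `open_violation_on_expiration = true` (the "open new lost signal violation" toggle in the UI) and keeping `expiration_duration` aligned with the condition's threshold duration, as described above.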


That worked! Thank you for your help.