Alert policy no longer sending any notifications

For one of our alert policies, we are no longer receiving any notifications (email, webhook), even though the policy shows that it is still detecting the events.

The link to the policy is here: https://onenr.io/0LZQWDMzkwW. Note that the events that trigger this condition don't occur often, so you will not see them frequently. However, if you query our logs for the last 7 days you will see the occurrences: https://onenr.io/0M8jqEDO1jl

The last successful email notification we received was on Jan 2nd.

Thanks,
Ron

Hello @ron.santos, it looks like you also opened a support case for this issue, so I will post the reply here as well.

I noticed a few things with your alert policy, and I have some documentation linked below that should help resolve this issue. In short, two things are happening together: event flow aggregation needs two data points before it evaluates, and the count(*) function treats windows with zero matching events as NULL rather than 0. Looking at your data, there is a steady stream of those zero-event windows, so the signal never produces the second data point the condition needs, which is why you are not getting alerts.
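
To illustrate, here is a minimal sketch of the shape of condition query where this happens; the event type and WHERE clause are hypothetical placeholders, not your actual condition:

```
// In any aggregation window with no matching events, count(*) produces
// no data point (NULL) instead of 0, so that window contributes nothing
// toward the two data points event flow needs before it evaluates.
SELECT count(*)
FROM Log
WHERE message LIKE '%example failure%'
```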

Event flow needs two data points before it will aggregate the data, so if the data is too infrequent, an incident will never open. After 30 minutes, if a second data point has not been received, the first data point falls into a 'stale' category and is no longer considered. What this means is that if your data is not coming in consistently, or there are large windows of time where nothing is sent to New Relic, you will want to use the event timer aggregation method instead (see the sketch below for a way to check how sparse the signal is).
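
One way to confirm how sparse the signal actually is: bucket the matches over the last 7 days. This is a sketch using the same hypothetical placeholders as above; long runs of empty buckets are exactly the case where event flow keeps discarding the first data point as stale:

```
// Count matching events per hour over the last 7 days. Hours with no
// matches show up as gaps (NULL), not as 0; long gaps mean event flow
// will rarely see two data points within its 30-minute stale window.
SELECT count(*)
FROM Log
WHERE message LIKE '%example failure%'
SINCE 7 days ago TIMESERIES 1 hour
```

Note that the aggregation method itself is a setting on the alert condition (in its advanced signal settings), not something you change in the query.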

Relevant documentation:

- Choose your aggregation method
- New aggregation methods for NRQL alert conditions
- Relic Solution: How Can I Figure Out Which Aggregation Method To Use?