
I'm facing an issue with notifications when data is missing. We receive metrics every hour, and we have set up a monitor to notify us if no metrics are received for 70 minutes. However, even when we stop sending the metrics, no notification arrives after 70 minutes. Why might this be happening? Here is the monitor definition:

{
    "id": "<ID>",
    "name": "Test Monitor",
    "type": "metric alert",
    "query": "sum(last_1m):count:jobs.operation_time{*} by {job_type}.as_count() > 20  (this is an impossible condition)",
    "message": "{{#is_no_data}}\nNo data received\n{{/is_no_data}}@[email protected]",
    "tags": [
        "tag1"
    ],
    "options": {
        "notify_audit": false,
        "locked": false,
        "timeout_h": 0,
        "silenced": {},
        "include_tags": true,
        "no_data_timeframe": 70,
        "require_full_window": true,
        "new_host_delay": 300,
        "notify_no_data": true,
        "renotify_interval": 0,
        "escalation_message": "",
        "thresholds": {
            "critical": 20
        }
    },
    "priority": 1,
    "classification": "metric"
}
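
For reference, one way to sanity-check what Datadog has actually stored for this monitor (and whether it ever leaves the OK state) is to fetch it through the API. This is only a minimal sketch assuming the legacy datadogpy client; the API/app keys and monitor ID are placeholders, not real values.

# Minimal sketch using the legacy datadogpy client (pip install datadog).
# The keys and monitor ID below are placeholders.
from datadog import initialize, api

initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")

MONITOR_ID = 12345678  # hypothetical ID of the monitor shown above

monitor = api.Monitor.get(MONITOR_ID)

# overall_state should switch to "No Data" once no_data_timeframe elapses
print(monitor.get("overall_state"))
print(monitor["options"].get("notify_no_data"),
      monitor["options"].get("no_data_timeframe"),
      monitor["options"].get("require_full_window"))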
  • Could you try setting require_full_window to false? In the UI, this parameter is recommended when the metric is sparse (a sketch of that change follows these comments). Commented Aug 9, 2021 at 8:13
  • I have already tried that. Commented Aug 10, 2021 at 7:43
  • The underlying problem is that the monitor never goes into the "No Data" state even when no metrics are received for hours. I don't understand why it stays in the OK state. Commented Aug 10, 2021 at 12:53
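
For completeness, the change suggested in the first comment can also be applied through the API. This is only a sketch under the same assumptions as above (legacy datadogpy client, placeholder credentials and monitor ID):

# Sketch: set require_full_window to false via the API (datadogpy, placeholders as above).
from datadog import initialize, api

initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")

MONITOR_ID = 12345678  # hypothetical ID

monitor = api.Monitor.get(MONITOR_ID)
options = monitor["options"]
options["require_full_window"] = False  # evaluate even when the window is only partly filled

# Re-send the existing query together with the modified options.
api.Monitor.update(MONITOR_ID, query=monitor["query"], options=options)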

1 Answer


It turned out to be an issue on Datadog's side, which has since been fixed: https://status.datadoghq.com/incidents/jymbdkfx0y0h
