Automated Alerts for Airflow with Slack

Take advantage of the Slack API to get automatic updates and alerts for DAG task failures and successes

Chris Young
Towards Data Science



Let’s face it: sometimes it can take a while for Airflow DAGs to run. Instead of repeatedly returning to the Airflow UI to check for DAG updates, why not catch up on emails, messages, and backlog items, and get notified of the run results via Slack?

Managing Airflow notifications through Slack makes it easy to monitor and debug Airflow tasks. A dedicated Slack channel for Airflow also gives others (like clients, product managers, and teammates) visibility into the status of workflows without needing to check the Airflow UI.

Requirements

  1. Slack administrator access for your workspace
  2. Slack API key
  3. Airflow instance

Getting started

Install the following Python libraries for your Airflow instance:

# requirements.txt
apache-airflow-providers-http
apache-airflow-providers-slack
apache-airflow-providers-slack[http]

Prepping Slack

Create a channel in your workspace for the API to access. I called mine “airflow”.

Go to api.slack.com/apps and click “Create New App” to create an app for your Airflow instance to use. After exiting the modal, enable incoming webhooks for your Slack workspace app. Then, scroll to the bottom and select “Add new webhook to workspace”.

Slack API webhooks

You can test your webhook from the command line to verify everything is set up correctly (replace the URL below with your own webhook URL):

curl -X POST -H 'Content-type: application/json' --data '{"text":"Hello, World!"}' https://hooks.slack.com/services/00000000000/00000000000/000000000000000000000000

Finishing Airflow setup

Once you have Airflow up and running, create a new HTTP connection in the UI. Name it slack, insert the first part of the webhook URL into the “host” input, and put the remainder of the URL (the secret portion) into the “password” input.
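To make the split concrete, here is a small sketch (the webhook URL below is a placeholder, and `split_webhook_url` is a hypothetical helper just for illustration, not part of Airflow):

```python
def split_webhook_url(url: str):
    """Split a Slack webhook URL into the two values the Airflow
    connection form expects: the static base goes in "host", and the
    secret trailing path goes in "password"."""
    base = "https://hooks.slack.com/services"
    return base, url[len(base):]


# Example with a placeholder webhook URL:
host, password = split_webhook_url(
    "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"
)
# host     -> "https://hooks.slack.com/services"
# password -> "/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"
```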

Airflow Slack HTTP connection settings

Now we can use Airflow’s BaseHook, SlackWebhookOperator, and the DAG task instance context to send alerts to Slack with information about the latest DAG run.

Code for Slack alert
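The embedded gist isn’t reproduced here, so below is a minimal sketch of what such a callback might look like. The helper `build_failure_message` and the connection id `slack` are assumptions based on the setup above, and the SlackWebhookOperator parameters shown match older versions of apache-airflow-providers-slack (newer versions have renamed some of them, so check the provider docs for your version). The Airflow imports are deferred into the callback so the message helper can be exercised on its own.

```python
def build_failure_message(context):
    # Pull task details out of the Airflow task-instance context.
    ti = context.get("task_instance")
    return (
        ":red_circle: Task failed.\n"
        f"*Task*: {ti.task_id}\n"
        f"*DAG*: {ti.dag_id}\n"
        f"*Execution date*: {context.get('execution_date')}\n"
        f"*Log URL*: {ti.log_url}"
    )


def slack_fail_alert(context):
    # Imports kept inside the callback so the helper above stays
    # importable without a full Airflow installation.
    from airflow.hooks.base import BaseHook
    from airflow.providers.slack.operators.slack_webhook import SlackWebhookOperator

    # The secret part of the webhook URL lives in the connection's password field.
    webhook_token = BaseHook.get_connection("slack").password
    return SlackWebhookOperator(
        task_id="slack_fail_alert",
        http_conn_id="slack",          # the Airflow connection created above
        webhook_token=webhook_token,
        message=build_failure_message(context),
        channel="#airflow",            # the channel created earlier
    ).execute(context=context)
```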

This code will send a message whenever an Airflow task fails. The last step to automate the process is to add the slack_fail_alert function as the on_failure_callback in the DAG default arguments:

default_args = {
    ...,
    'on_failure_callback': slack_fail_alert
}

This code can also be easily modified to alert on task success or other states.
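For example, a success variant could reuse the same pattern with a different message (a sketch; `build_success_message` is a hypothetical helper mirroring the failure message above):

```python
def build_success_message(context):
    # Same context fields as the failure alert, with a friendlier header.
    ti = context.get("task_instance")
    return (
        ":white_check_mark: Task succeeded.\n"
        f"*Task*: {ti.task_id}\n"
        f"*DAG*: {ti.dag_id}\n"
        f"*Execution date*: {context.get('execution_date')}"
    )


# Wired up the same way as the failure alert, e.g.:
# default_args = {..., 'on_success_callback': slack_success_alert}
```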

Testing

To test, intentionally introduce an error into your code to verify that Airflow sends a message upon task failure. Alternatively, you can use these functions in the on_success_callback argument if you don’t want to break your code. Assuming everything is set up correctly, you will receive a message with details about the latest task runs.
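One simple way to force a failure is a task callable that always raises (a hypothetical example; any uncaught exception inside a task marks it failed and fires the callback):

```python
def always_fail():
    # Any uncaught exception marks the task as failed, which triggers
    # on_failure_callback and sends the Slack alert.
    raise ValueError("intentional failure to test Slack alerting")


# Hypothetical usage inside a DAG file:
# PythonOperator(task_id="break_things", python_callable=always_fail)
```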

Next time you experience a failure in your DAG, you will be notified via Slack with the name of the task and a direct link to the logs so you can begin debugging.

Slack-Airflow message alert

And that’s it! With just a little bit of setup, we have created automated monitoring and alerting for Airflow. A good next step is inviting relevant stakeholders on your projects to the #airflow Slack channel to give them insight into the status of current workflows. If end users of your application or data warehouse ever see a bug, they can check the #airflow channel to see whether the cause is immediately apparent before escalating the issue to others.

Thanks for reading! Check out this repository for additional code and context, as well as other useful Airflow tips.


Lifelong learner passionate about tech, data, cloud ops, sports, and law. Graduate of the BYU MISM program and current Data Engineer.