Airflow setup not running tasks

My Airflow setup doesn't run tasks, not even the example DAGs. Whenever I trigger a run manually, a DagRun object is created with the status running, but it stays in that state forever. This issue affects all DAGs, not just one specific DAG.
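
For what it's worth, the stuck run can be inspected from the CLI; the DAG id, task id and date below are just placeholders based on the bundled example_bash_operator DAG, not my actual values:

    # check the state of the stuck DagRun
    airflow dag_state example_bash_operator 2017-05-01T00:00:00
    # check the state of an individual task instance in that run
    airflow task_state example_bash_operator runme_0 2017-05-01T00:00:00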

Whenever I trigger a DAG, I can see it appear in the scheduler log, but nothing shows up in the Celery worker log.
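
Since the scheduler logs the run but the worker never sees it, one check (a sketch, assuming the Redis broker described below and Airflow's default queue name, default) is whether anything actually reaches the broker:

    # Celery keeps queued task messages in a Redis list named after the
    # queue -- "default" unless default_queue is changed in airflow.cfg
    redis-cli -h myhostname.com -p 6379 -n 10 llen default
    # list all keys in that Redis DB to see what Celery has written
    redis-cli -h myhostname.com -p 6379 -n 10 keys '*'

A growing list while the worker sits idle would point at the worker side; an always-empty queue would point at the scheduler/executor side.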

I can run tasks inside a DAG using airflow test, whereas airflow trigger_dag (or a manual trigger) doesn't work.
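
To make the two cases concrete (the DAG/task ids and date are placeholders):

    # works: executes the task in-process, bypassing the scheduler,
    # the broker and the celery worker entirely
    airflow test example_bash_operator runme_0 2017-05-01

    # creates a DagRun that never progresses; this path has to go
    # through scheduler -> broker -> celery worker
    airflow trigger_dag example_bash_operator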

I verified that all three processes are running, and I have now added them under supervisor as well:

  • airflow webserver
  • airflow scheduler
  • airflow worker
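
The supervisor entries look roughly like this (a sketch; the install path and user are placeholders, not my exact values):

    [program:airflow-webserver]
    command=/usr/local/bin/airflow webserver
    user=airflow
    autostart=true
    autorestart=true

    [program:airflow-scheduler]
    command=/usr/local/bin/airflow scheduler
    user=airflow
    autostart=true
    autorestart=true

    [program:airflow-worker]
    command=/usr/local/bin/airflow worker
    user=airflow
    autostart=true
    autorestart=true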

Things I have tried

  • I tried changing the executor to LocalExecutor instead of the CeleryExecutor, which didn't help.
  • I am currently using Redis for the queues, with broker_url = redis://myhostname.com:6379/10 and the result backend set as celery_result_backend = amqp://guest:guest@localhost:5672. I tried various combinations of RabbitMQ and Redis for these two settings, but it didn't help (see the config sketch after this list).
  • For RabbitMQ I've tried using both the amqp:// and pyamqp:// formats to specify the broker URL.
  • I tried changing the Celery version, but it resulted in errors. The Celery version I am using is celery==4.0.2.
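
For reference, the relevant airflow.cfg sections for one of the combinations I described (shown as a sketch; pointing the result backend at the same Redis instance is just one of the variants, and the section names follow the Airflow 1.x layout):

    [core]
    executor = CeleryExecutor

    [celery]
    # broker: Redis DB 10 on my host
    broker_url = redis://myhostname.com:6379/10
    # one variant: keep the result backend on the same Redis instance
    celery_result_backend = redis://myhostname.com:6379/10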

This setup is on Ubuntu 14.04.5 LTS; I was able to successfully run a local version of Airflow on my Mac.

I've been stuck on this for several weeks. Can anyone help me figure out / debug this issue?
