Kill an Airflow task running at a remote location via the Airflow interface

The Airflow installation is on EC2 and runs scripts on EMR. If I use the Clear option in the UI, the UI shows the task in the shutdown state, but I can still see the task running on EMR.

The Airflow instance is running the LocalExecutor, and I want to know how to kill the running task.

Should I use the Purge option in the UI to stop the running task, or do I need to clear the task and also make some code changes?

Below is my code

import logging
import sys
import StringIO

import paramiko

logger = logging.getLogger(__name__)

def execute_on_emr(cmd):
    # 'file', 'IP' and 'username' are defined elsewhere in the DAG module.
    with open(file, 'r') as f:
        keyfile = StringIO.StringIO(f.read())
    mykey = paramiko.RSAKey.from_private_key(keyfile)
    # Connect to the EMR master node over SSH and run the command.
    sshcon = paramiko.SSHClient()
    sshcon.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    sshcon.connect(IP, username=username, pkey=mykey)
    stdin, stdout, stderr = sshcon.exec_command(cmd)
    logger.info("stdout ------>" + str(stdout.readlines()))
    logger.info("Error--------->" + str(stderr.readlines()))
    exit_status = stdout.channel.recv_exit_status()
    if exit_status != 0:
        logger.info("Error Return code not Zero: " + str(exit_status))
        sys.exit(1)


Task = PythonOperator(
    task_id='XXX',
    python_callable=execute_on_emr,
    op_kwargs={'cmd': 'spark-submit /hadoop/scripts.py'},
    dag=dag)

      

My question is: how do I kill or stop a task from the Airflow UI so that the task running on EMR gets killed too?

Thanks in advance.


1 answer


This is implemented in a PR that was recently merged into master.



If you can run Airflow from master, I suggest you try it. Otherwise, work is underway on a 1.10 release that will include this feature.
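For completeness, one way to make the kill reach EMR is to keep the SSH session on the operator and clean it up in on_kill(), the BaseOperator hook Airflow invokes when a running task instance is killed. Below is a minimal sketch of that idea for a paramiko-based setup, not the merged implementation; the EmrSshOperator name, its constructor parameters, and the use of a pseudo-terminal are illustrative assumptions.

import logging

import paramiko
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

logger = logging.getLogger(__name__)


class EmrSshOperator(BaseOperator):
    """Runs a command on EMR over SSH and closes the session if the task is killed."""

    @apply_defaults
    def __init__(self, cmd, host, username, keyfile_path, *args, **kwargs):
        super(EmrSshOperator, self).__init__(*args, **kwargs)
        self.cmd = cmd
        self.host = host
        self.username = username
        self.keyfile_path = keyfile_path
        self.client = None
        self.channel = None

    def execute(self, context):
        key = paramiko.RSAKey.from_private_key_file(self.keyfile_path)
        self.client = paramiko.SSHClient()
        self.client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        self.client.connect(self.host, username=self.username, pkey=key)
        # Run the command on a pty so that closing the channel hangs up the
        # remote shell instead of leaving the command running.
        self.channel = self.client.get_transport().open_session()
        self.channel.get_pty()
        self.channel.exec_command(self.cmd)
        exit_status = self.channel.recv_exit_status()
        logger.info("Remote command exited with status %s", exit_status)
        if exit_status != 0:
            raise RuntimeError("Remote command failed: %s" % self.cmd)

    def on_kill(self):
        # Called by Airflow when the running task instance is killed.
        if self.channel is not None:
            self.channel.close()
        if self.client is not None:
            self.client.close()
        logger.info("SSH session closed after task kill")

Closing a channel that was allocated a pty should hang up the remote shell; whether that actually stops the Spark job depends on how it was submitted, so killing the YARN application by id from on_kill() is a more explicit alternative.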
