Airflow KubernetesPodOperator example won't run

Problem description

Trying to run the example KubernetesPodOperator, I get:

[2020-05-25 20:00:40,475] {__init__.py:51} INFO - Using executor LocalExecutor
[2020-05-25 20:00:40,475] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags/kubernetes_example.py
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 37, in <module>
    args.func(args)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 75, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 523, in run
    dag = get_dag(args)
  File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 149, in get_dag
    'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: kubernetes_example. Either the dag did not exist or it failed to parse.

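The exception says the DAG either does not exist or failed to parse. The quickest way to distinguish the two is to execute the DAG file directly with Python, which surfaces any syntax or import error immediately. A minimal sketch of that check (the `check_dag_file` helper is hypothetical, not part of Airflow):

```python
import runpy
import tempfile

def check_dag_file(path):
    """Run a DAG file the way `python <path>` would and return the
    exception that broke parsing, or None if the file is importable."""
    try:
        runpy.run_path(path)
        return None
    except Exception as exc:  # any syntax/import/runtime error aborts DAG parsing
        return exc

# Demo: a deliberately broken file standing in for a DAG that fails to parse.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("import no_such_module_xyz\n")
    broken = f.name

err = check_dag_file(broken)
print(type(err).__name__)  # ModuleNotFoundError
```

In practice simply running `python /usr/local/airflow/dags/kubernetes_example.py` on the scheduler host gives the same information: if it raises, the DAG "failed to parse"; if it runs cleanly, look elsewhere.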
Here is the code I am using:

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago



default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=60)
}

dag = DAG(
    'kubernetes_example', default_args=default_args, schedule_interval=timedelta(minutes=60))


start = DummyOperator(task_id='run_this_first', dag=dag)

passing = KubernetesPodOperator(namespace='airflow',
                          image="python:3.6.10",
                          cmds=["python", "-c"],  # the binary is lowercase "python"
                          arguments=["print('hello world')"],
                          labels={"foo": "bar"},
                          name="passing-test",
                          task_id="passing-task",
                          env_vars={'EXAMPLE_VAR': '/example/value'},
                          in_cluster=True,
                          get_logs=True,
                          dag=dag
                          )

failing = KubernetesPodOperator(namespace='airflow',
                          image="ubuntu:18.04",
                          cmds=["python", "-c"],  # fails anyway: ubuntu:18.04 ships no python
                          arguments=["print('hello world')"],
                          labels={"foo": "bar"},
                          name="fail",
                          task_id="failing-task",
                          get_logs=True,
                          dag=dag
                          )

passing.set_upstream(start)
failing.set_upstream(start)

I took this straight from the example docs. Has anyone run into this problem?

Thanks!

airflow airflow-scheduler airflow-operator
1 Answer

You need to give the DAG a name (its dag_id):

dag = DAG(
    dag_id='kubernetes_example', 
    default_args=default_args, 
    schedule_interval=timedelta(minutes=60)
)

Also, your task_id should use _ rather than -, i.e. it should be: task_id="failing_task"
