Dataflow job via an Airflow DAG

Problem description

I'm trying to execute an Apache Beam pipeline Python file with the Dataflow runner through a BashOperator in Airflow. I already know how to pass arguments to the Python file dynamically. What I'd like to do is streamline the parameters, i.e. avoid passing every single one individually. Sample snippet:

test_context.py

import sys

def run_awc_orders(*args, **kwargs):
    print("all arguments -> ",  args)

if __name__ == "__main__":
    print("all params -> ", sys.argv)
    run_awc_orders( sys.argv[1],  sys.argv[2], sys.argv[3])

my_dag.py

test_DF_job = BashOperator(
    task_id='test_DF_job',
    provide_context=True,
    bash_command=(
        "python /usr/local/airflow/dags/test_context.py "
        "{{ execution_date }} {{ next_execution_date }} {{ params.db_params.new_text }} "
        "--runner DataflowRunner --key path_to_creds_json_file --project project_name "
        "--staging_location staging_gcp_bucket_location --temp_location=temp_gcp_bucket_location "
        "--job_name test-job"
    ),
    params={
        'db_params': {
            'new_text': 'Hello World'
        }
    },
    dag=dag
)

This is what we see in the logs in the Airflow UI:

[2019-09-25 06:44:44,103] {bash_operator.py:128} INFO - all params ->  ['/usr/local/airflow/dags/test_context.py', '2019-09-23T00:00:00+00:00', '2019-09-24T00:00:00+00:00', '127.0.0.1']
[2019-09-25 06:44:44,103] {bash_operator.py:128} INFO - all arguments ->  ('2019-09-23T00:00:00+00:00', '2019-09-24T00:00:00+00:00', '127.0.0.1')
[2019-09-25 06:44:44,106] {bash_operator.py:132} INFO - Command exited with return code 0
Tags: python, google-cloud-dataflow, airflow, apache-beam
1 Answer

I believe the recommended approach is to use Airflow's DataflowPythonOperator, which takes the Python pipeline file and the Dataflow options directly.

You would do something like the following with DataflowPythonOperator.
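As a rough, untested sketch: this assumes the Airflow 1.10 contrib module, where the class is spelled DataFlowPythonOperator, and it simply reuses the placeholder project, bucket, and file paths from the question.

from airflow.contrib.operators.dataflow_operator import DataFlowPythonOperator

test_DF_job = DataFlowPythonOperator(
    task_id='test_DF_job',
    # Path to the Beam pipeline file, as in the question.
    py_file='/usr/local/airflow/dags/test_context.py',
    # Per-run pipeline options are handed to the pipeline as --key=value
    # flags; whether Jinja templating is rendered here depends on the
    # operator's template_fields in your Airflow version.
    options={
        'start_date': '{{ execution_date }}',
        'end_date': '{{ next_execution_date }}',
        'new_text': 'Hello World',
    },
    # Project-wide Dataflow settings grouped in one place instead of being
    # repeated in every bash_command (values are the question's placeholders).
    dataflow_default_options={
        'project': 'project_name',
        'staging_location': 'staging_gcp_bucket_location',
        'temp_location': 'temp_gcp_bucket_location',
    },
    dag=dag,
)

With this operator the runner, staging, and temp locations no longer have to be spelled out in a bash string, and GCP credentials come from the Airflow connection (gcp_conn_id, which defaults to 'google_cloud_default') rather than a --key flag.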