Trying to create dynamic subdags from a parent DAG based on an array of filenames


I am trying to use Airflow to move S3 files from a "no-delete" bucket (meaning I cannot delete the files) to GCS. I cannot guarantee that new files will arrive every day, but I have to check for new files every day.

My problem is creating the subdags dynamically. If there ARE files, I need subdags. If there are no files, I don't need subdags. My problem is the upstream/downstream setup. In my code, it does detect the files, but it does not kick off the subdags as expected. I'm missing something.

Here is my code:

from airflow import models
from airflow.utils.helpers import chain
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.operators.python_operator import PythonOperator, BranchPythonOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.subdag_operator import SubDagOperator
from airflow.contrib.operators.s3_to_gcs_operator import S3ToGoogleCloudStorageOperator
from airflow.utils import dates
from airflow.models import Variable
import logging

args = {
    'owner': 'Airflow',
    'start_date': dates.days_ago(1),
    'email': ['sinistersparrow1701@gmail.com'],
    'email_on_failure': True,
    'email_on_success': True,
}

bucket = 'mybucket'
prefix = 'myprefix/'
LastBDEXDate = int(Variable.get("last_publish_date"))
maxdate = LastBDEXDate
files = []

parent_dag = models.DAG(
    dag_id='My_Ingestion',
    default_args=args,
    schedule_interval='@daily',
    catchup=False
)

def Check_For_Files(**kwargs):
    s3 = S3Hook(aws_conn_id='S3_BOX')
    s3.get_conn()
    bucket = bucket
    LastBDEXDate = int(Variable.get("last_publish_date"))
    maxdate = LastBDEXDate
    files = s3.list_keys(bucket_name=bucket, prefix='myprefix/file')
    for file in files:
        print(file)
        print(file.split("_")[-2])
        print(file.split("_")[-2][-8:])  ##proves I can see a date in the file name is ok.
        maxdate = maxdate if maxdate > int(file.split("_")[-2][-8:]) else int(file.split("_")[-2][-8:])
    if maxdate > LastBDEXDate:
        return 'Start_Process'
    return 'finished'

def create_subdag(dag_parent, dag_id_child_prefix, file_name):
    # dag params
    dag_id_child = '%s.%s' % (dag_parent.dag_id, dag_id_child_prefix)

    # dag
    subdag = models.DAG(dag_id=dag_id_child,
              default_args=args,
              schedule_interval=None)

    # operators
    s3_to_gcs_op = S3ToGoogleCloudStorageOperator(
        task_id=dag_id_child,
        bucket=bucket,
        prefix=file_name,
        dest_gcs_conn_id='GCP_Account',
        dest_gcs='gs://my_files/To_Process/',
        replace=False,
        gzip=True,
        dag=subdag)


    return subdag

def create_subdag_operator(dag_parent, filename, index):
    tid_subdag = 'file_{}'.format(index)
    subdag = create_subdag(dag_parent, tid_subdag, filename)
    sd_op = SubDagOperator(task_id=tid_subdag, dag=dag_parent, subdag=subdag)
    return sd_op

def create_subdag_operators(dag_parent, file_list):
    subdags = [create_subdag_operator(dag_parent, file, file_list.index(file)) for file in file_list]
    # chain subdag-operators together
    chain(*subdags)
    return subdags

check_for_files = BranchPythonOperator(
    task_id='Check_for_s3_Files',
    provide_context=True,
    python_callable=Check_For_Files,
    dag=parent_dag
)

finished = DummyOperator(
    task_id='finished',
    dag=parent_dag
)

decision_to_continue = DummyOperator(
    task_id='Start_Process',
    dag=parent_dag
)

if len(files) > 0:
    subdag_ops = create_subdag_operators(parent_dag, files)
    check_for_files >> decision_to_continue >> subdag_ops[0] >> subdag_ops[-1] >> finished


check_for_files >> finished

What kind of jobs do these DAGs run on the backend? Are they Spark jobs or Python scripts, and what do you use to run them, Livy or some other method?
ashwin agrawal

Sorry, I don't understand the question. Can you rephrase it?
arcee123

What I mean is, you are only using simple Python scripts and not any Spark jobs, right?
Ashwin agrawal

Yes. Simple operators that are default in Airflow. I want to add existing operators at a dynamic rate, based on flagged files in S3 that need to be imported into GCS.
arcee123

Why is files an empty list?
Oluwafemi Sule

Answers:


Below is the recommended way to create a dynamic DAG or sub-DAG in Airflow; there are other ways as well, but I guess this would be largely applicable to your problem.

First, create a (yaml/csv) file containing the list of all S3 files and their locations. Since you have written a function that stores them in a list, I would say store them in a separate yaml file instead, load it into the Airflow env at runtime, and then create the DAG.

Below is a sample yaml file: dynamicDagConfigFile.yaml

job: dynamic-dag
bucket_name: 'bucket-name'
prefix: 'bucket-prefix'
S3Files:
    - File1: 'S3Loc1'
    - File2: 'S3Loc2'
    - File3: 'S3Loc3'

You can modify your Check_For_Files function to store the files in a yaml file.
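For example, a minimal sketch of that change could look like the following (the helper name, config path, and key-to-location layout are assumptions of mine that mirror the sample yaml above):

import yaml
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def write_file_list_to_yaml(config_path='/usr/local/airflow/dags/config_files/dynamicDagConfigFile.yaml'):
    # Hypothetical helper: list the S3 keys and persist them in the yaml
    # config that the dynamic DAG reads at parse time.
    s3 = S3Hook(aws_conn_id='S3_BOX')
    keys = s3.list_keys(bucket_name='mybucket', prefix='myprefix/file') or []
    config = {
        'job': 'dynamic-dag',
        'bucket_name': 'mybucket',
        'prefix': 'myprefix/',
        # one {FileName: S3Location} entry per file, matching the sample yaml
        'S3Files': [{key.split('/')[-1]: key} for key in keys],
    }
    with open(config_path, 'w') as f:
        yaml.safe_dump(config, f)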

Now we can move on to creating the DAG dynamically:

First define two tasks using dummy operators, namely the start and the end task. These are the tasks we will build our DAG upon, by dynamically creating tasks between them:

start = DummyOperator(
    task_id='start',
    dag=dag
)

end = DummyOperator(
    task_id='end',
    dag=dag)

Dynamic DAG: We will use PythonOperators in Airflow. The function should receive as arguments the task id; the python function to be executed, i.e. the python_callable for the PythonOperator; and a set of args to be used during the execution.

Including the task id allows us to exchange data among tasks generated in a dynamic way, e.g. via XCOM.

You can specify your operation function within this dynamic dag, e.g. s3_to_gcs_op.

def createDynamicDAG(task_id, callableFunction, args):
    task = PythonOperator(
        task_id = task_id,
        provide_context=True,
        #Eval is used since the callableFunction var is of type string
        #while the python_callable argument for PythonOperators only receives objects of type callable not strings.
        python_callable = eval(callableFunction),
        op_kwargs = args,
        xcom_push = True,
        dag = dag,
    )
    return task

Finally, based on the locations in the yaml file, you can create the dynamic dags. First read the yaml file as shown below and create the dynamic dag:

with open('/usr/local/airflow/dags/config_files/dynamicDagConfigFile.yaml') as f:
    # use safe_load instead to load the YAML file
    configFile = yaml.safe_load(f)

    #Extract file list
    S3Files = configFile['S3Files']

    #In this loop tasks are created for each file defined in the YAML file
    for S3File in S3Files:
        for S3File, fieldName in S3File.items():

            #Remember task id is provided in order to exchange data among tasks generated in dynamic way.
            get_s3_files = createDynamicDAG('{}-getS3Data'.format(S3File), 
                                            'getS3Data', 
                                            {}) #your configs here.

            #Second step is upload S3 to GCS
            upload_s3_toGCS = createDynamicDAG('{}-uploadDataS3ToGCS'.format(S3File), 'uploadDataS3ToGCS', {'previous_task_id':'{}-'})

#write your configs here again, like the S3 bucket name and prefix, etc., or read them from the yaml file, along with the other GCS config.

Final definition of the DAG:

The idea is:

#once tasks are generated they should be linked with the
#dummy operators generated in the start and end tasks.
start >> get_s3_files
get_s3_files >> upload_s3_toGCS
upload_s3_toGCS >> end

The complete Airflow code, in order, is:

import yaml
import airflow
from airflow import DAG
from datetime import datetime, timedelta, time
from airflow.operators.python_operator import PythonOperator
from airflow.operators.dummy_operator import DummyOperator

# The DAG object that all tasks below attach to; the dag_id, start_date and
# schedule used here are placeholder values.
dag = DAG(
    dag_id='dynamic_s3_to_gcs',
    default_args={'owner': 'Airflow', 'start_date': datetime(2020, 1, 1)},
    schedule_interval='@daily',
)

start = DummyOperator(
    task_id='start',
    dag=dag
)


def createDynamicDAG(task_id, callableFunction, args):
    task = PythonOperator(
        task_id = task_id,
        provide_context=True,
        #Eval is used since the callableFunction var is of type string
        #while the python_callable argument for PythonOperators only receives objects of type callable not strings.
        python_callable = eval(callableFunction),
        op_kwargs = args,
        xcom_push = True,
        dag = dag,
    )
    return task


end = DummyOperator(
    task_id='end',
    dag=dag)



with open('/usr/local/airflow/dags/config_files/dynamicDagConfigFile.yaml') as f:
    configFile = yaml.safe_load(f)

    #Extract file list
    S3Files = configFile['S3Files']

    #In this loop tasks are created for each file defined in the YAML file
    for S3File in S3Files:
        for S3File, fieldName in S3File.items():

            #Remember task id is provided in order to exchange data among tasks generated in dynamic way.
            get_s3_files = createDynamicDAG('{}-getS3Data'.format(S3File), 
                                            'getS3Data', 
                                            {}) #your configs here.

            #Second step is upload S3 to GCS
            upload_s3_toGCS = createDynamicDAG('{}-uploadDataS3ToGCS'.format(S3File), 'uploadDataS3ToGCS', {'previous_task_id':'{}-'})

#write your configs here again, like the S3 bucket name and prefix, etc., or read them from the yaml file, along with the other GCS config.


start >> get_s3_files
get_s3_files >> upload_s3_toGCS
upload_s3_toGCS >> end
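
Note that the callables getS3Data and uploadDataS3ToGCS referenced above are not defined in the answer; a rough sketch of what they might look like (the parameter names, connection ids, bucket, and destination are assumptions carried over from the question) is:

from airflow.contrib.operators.s3_to_gcs_operator import S3ToGoogleCloudStorageOperator

def getS3Data(s3_key, **kwargs):
    # Return the S3 key for this file; PythonOperator pushes the return
    # value to XCom automatically, so the next task can pull it.
    return s3_key

def uploadDataS3ToGCS(previous_task_id, **kwargs):
    # Pull the key produced by the previous task and copy that object to
    # GCS by invoking the transfer operator's execute() directly.
    s3_key = kwargs['ti'].xcom_pull(task_ids=previous_task_id)
    S3ToGoogleCloudStorageOperator(
        task_id='transfer_' + s3_key.replace('/', '_'),
        bucket='mybucket',
        prefix=s3_key,
        dest_gcs_conn_id='GCP_Account',
        dest_gcs='gs://my_files/To_Process/',
        replace=False,
        gzip=True,
    ).execute(context=kwargs)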

Thank you very much. So one of the problems I run into is: what happens if there are no new files? One of the issues I face is that there will always be files in this location, but there is no guarantee of new files to pull, which means the upload_s3_toGCS section will not exist and Airflow will error.
arcee123

You can solve the problem by removing files from the yaml file once all of them have been uploaded to GCS; that way only new files will be present in the yaml file. And in case there are no new files, the yaml file will be empty and no dynamic dag will be created. This is why a yaml file is a much better option than storing the files in a list.
ashwin agrawal
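
As an illustration, a small sketch of that cleanup step (assuming the yaml layout shown in the answer; the helper name is hypothetical):

import yaml

def remove_uploaded_files(uploaded_names, config_path='/usr/local/airflow/dags/config_files/dynamicDagConfigFile.yaml'):
    # Drop the entries that were successfully uploaded to GCS so that only
    # new files remain in the yaml config for the next DAG parse.
    with open(config_path) as f:
        config = yaml.safe_load(f)
    config['S3Files'] = [
        entry for entry in config.get('S3Files') or []
        if not set(entry) & set(uploaded_names)
    ]
    with open(config_path, 'w') as f:
        yaml.safe_dump(config, f)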

The yaml file will also, in a way, help you maintain a log of the S3 files: if, say, some S3 files fail to upload to GCS, you can also keep a flag corresponding to that file and then retry those files in the next DAG run.
ashwin agrawal

In case there are no new files, you can put an if condition before the DAG which checks for new files in the yaml file: if there are new files, execute it, otherwise skip it.
ashwin agrawal

The problem here is that the downstream is set. If the downstream is set without actual jobs (because there are no files), it will error.
arcee123
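
For reference, a minimal sketch of the if guard suggested above, which also avoids setting downstream dependencies when there are no files (the task ids and kwargs follow the earlier sketches and are assumptions):

with open('/usr/local/airflow/dags/config_files/dynamicDagConfigFile.yaml') as f:
    configFile = yaml.safe_load(f)

S3Files = configFile.get('S3Files') or []

if S3Files:
    previous = start
    for S3File in S3Files:
        for fileName, s3Loc in S3File.items():
            get_s3_files = createDynamicDAG('{}-getS3Data'.format(fileName),
                                            'getS3Data',
                                            {'s3_key': s3Loc})
            upload_s3_toGCS = createDynamicDAG('{}-uploadDataS3ToGCS'.format(fileName),
                                               'uploadDataS3ToGCS',
                                               {'previous_task_id': '{}-getS3Data'.format(fileName)})
            previous >> get_s3_files >> upload_s3_toGCS
            previous = upload_s3_toGCS
    previous >> end
else:
    # No new files: link start directly to end so the DAG still parses and runs.
    start >> end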