I am new to Airflow and want to create a pipeline where the function given to a PythonOperator pushes variables via XCom, and a BashOperator pulls those variables and passes them to a full Python script that it runs. Could someone help me build this kind of pipeline?

I am attaching my broken code below. In it, app.py is the Python script that takes the count_check and recon_check values. Also, can a single bash command pull multiple XCom variables?
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator
from datetime import datetime

args = {'owner': 'airflow', 'start_date': datetime(2022, 1, 1)}
def take_args(ti):
    count_check = 'Y'
    recon_check = 'Y'
    # push both flags as XComs so a downstream task can pull them by key
    ti.xcom_push(key='count', value=count_check)
    ti.xcom_push(key='recon', value=recon_check)
with DAG(dag_id='data-validation-dag-python',
         default_args=args,
         schedule_interval='@daily',
         max_active_runs=1,
         catchup=False) as dag:

    task_1 = PythonOperator(
        task_id='Storing_Args',
        python_callable=take_args
    )

    task_2 = BashOperator(
        task_id='task_validation_checks',
        bash_command='cd /root/airflow/dags && python3 app.py {{ ti.xcom_pull(task_ids["Storing_Args"]) }}',
        do_xcom_push=False
    )

    task_1 >> task_2
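
For reference, app.py just reads the two flags from the command line, roughly like this (a stripped-down sketch; the exact argument handling is an assumption here, and the real script runs the actual validation logic):

# app.py (sketch) -- assumes the two flags arrive as positional command-line arguments
import sys

def main():
    # expected call: python3 app.py <count_check> <recon_check>
    count_check = sys.argv[1] if len(sys.argv) > 1 else 'N'
    recon_check = sys.argv[2] if len(sys.argv) > 2 else 'N'

    if count_check == 'Y':
        print('running count validation...')
    if recon_check == 'Y':
        print('running recon validation...')

if __name__ == '__main__':
    main()

So when task_2 runs, I expect the rendered command to come out as something like python3 app.py Y Y, with both values pulled from the XComs pushed by Storing_Args.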