[Tutor] airflow dag

shubham goyal skgoyal721 at gmail.com
Thu May 25 08:15:23 EDT 2017


Hey guys,

I want to ask whether we can pass parameters as command-line arguments in
airflow when we are triggering the dag, and access them inside the dag's
python script/file.
script:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.http_operator import SimpleHttpOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.now(),
    'email': ['airflow@airflow.com'],
    'email_on_failure': False,
    'email_on_retry': False
}

MAIN_DAG = 'check_dag'
dag = DAG(dag_id=MAIN_DAG, default_args=default_args,
          schedule_interval=None)

# `file` and `passwd` are assumed to be defined elsewhere in the script
with open(file, "r") as f:
    payload = f.read()  # read the JSON request body from a file
    cluster_create = SimpleHttpOperator(  # create cluster via the REST API
        task_id='cluster_create',
        method='POST',
        http_conn_id='qubole_default',
        # for directing to https://qa.qubole.net/api
        endpoint='/v2/clusters?auth_token=%s' % passwd,
        data=payload,
        headers={"Content-Type": "application/json"},
        params={'auth_token': passwd},
        response_check=lambda response: response.status_code == 200,
        dag=dag
    )

Like this, here I am trying to create a cluster, but I need to pass the
password as a CLI argument when I trigger the dag. Can we do that? Please help.
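For reference, Airflow's CLI does support handing a JSON payload to a DAG at
trigger time via `trigger_dag --conf`, which tasks can then read from
`dag_run.conf`. A minimal sketch, assuming a recent-enough Airflow and using
`passwd` as an example key name (the operator wiring is omitted so the snippet
stands alone):

```python
# Trigger the DAG with a JSON payload on the command line:
#
#   airflow trigger_dag check_dag --conf '{"passwd": "SECRET"}'
#
# Airflow stores that payload on the DagRun object as `dag_run.conf`.
# A PythonOperator callable with provide_context=True receives the
# DagRun in its context and can read the value back out:

def create_cluster(**context):
    # dag_run.conf is the dict passed via --conf; it may be None if
    # the DAG was triggered without one, hence the `or {}` guard.
    conf = context['dag_run'].conf or {}
    return conf.get('passwd')
```

Templated operator fields can also reference the payload directly, e.g.
`endpoint='/v2/clusters?auth_token={{ dag_run.conf["passwd"] }}'`, since
`endpoint` and `data` are templated fields of SimpleHttpOperator.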


More information about the Tutor mailing list