Creating Airflow Dag using Python Operator | Creating Tag and changing Owner name of Dag #airflow

In this video, we'll walk through the process of creating a data pipeline using Apache Airflow and the PythonOperator. We'll start by setting up an Airflow DAG to schedule and execute our data pipeline, and then we'll create a series of PythonOperator tasks to perform our data transformations. We'll demonstrate how to use the PythonOperator to perform common data manipulation tasks, such as filtering and aggregating data, and show you how to chain together multiple tasks to create a complete data pipeline. By the end of this tutorial, you'll have the skills you need to create your own data pipelines using Airflow and the PythonOperator.
#aws #awstutorial #code
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def sum_numbers():
    # Renamed from sum() to avoid shadowing Python's built-in sum
    a = 5
    b = 6
    print(f"sum of a+b is {a + b}")


with DAG(
    dag_id="demo_python",
    default_args={"owner": "sumit"},
    start_date=datetime(2023, 4, 28),
    schedule="0 0 * * *",  # run daily at midnight
    tags=["python_test"],
) as dag:

    sum_task = PythonOperator(
        task_id="sum_task",
        python_callable=sum_numbers,
    )

    sum_task
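The description also mentions chaining filtering and aggregation tasks into a pipeline. The callables you hand to a PythonOperator are ordinary Python functions, so their logic can be sketched and tested outside Airflow. A minimal sketch (function names and the sample data are hypothetical, not from the video; inside a DAG each function would be wrapped in its own PythonOperator and chained with `>>`):

```python
# Hypothetical callables for a filter -> aggregate pipeline.
# In a DAG, each would be the python_callable of a PythonOperator.

def filter_records(records):
    """Keep only records with a positive amount."""
    return [r for r in records if r["amount"] > 0]


def aggregate_records(records):
    """Sum the amounts of the filtered records."""
    return sum(r["amount"] for r in records)


data = [{"amount": 10}, {"amount": -3}, {"amount": 7}]
filtered = filter_records(data)
total = aggregate_records(filtered)
print(f"total of {len(filtered)} records: {total}")  # total of 2 records: 17
```

In a real DAG the two operators would pass data between tasks via XCom rather than local variables, and the dependency would be declared as `filter_task >> aggregate_task`.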