Build a Data Pipeline - Airflow, dbt, Snowflake, and more!

Data Engineering with Apache Airflow, Snowflake & dbt

Timestamps:

0:00 Introduction
0:43 Self-promotion
1:09 Snowflake Background
3:21 dbt Background
5:55 Apache Airflow Background
6:45 Github Background
7:35 Docker Background
7:50 High Level Diagram
9:06 Prerequisites
9:37 git clone
10:34 Configuration
10:55 High Level Detail
12:24 docker-compose build
13:23 docker-compose up
13:39 docker ps
13:58 docker exec -it [] /bin/bash
14:30 Apache Airflow Web Interface
15:36 Snowflake integration
15:45 Conclusion
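
For convenience, the Docker steps listed above follow roughly this sequence (the container name is a placeholder; run docker ps to find the actual name on your machine):

docker-compose build          # build the images defined in the repository's docker-compose file
docker-compose up             # start the Airflow services
docker ps                     # list running containers and note the Airflow container name
docker exec -it <airflow-container-name> /bin/bash   # open a shell inside the container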

#dataengineering
#airflow
#dbt
#snowflake
#github
#docker

This video walks through a repository that builds a data pipeline with Apache Airflow, Snowflake, and dbt. Learn to orchestrate data workflows with Airflow, transform data in Snowflake using dbt, and run everything in Docker for an easy, reproducible setup.
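
As a rough sketch of the dbt step (the container name and project path are placeholders, and the exact invocation used in the repository may differ):

docker exec -it <airflow-container-name> /bin/bash   # open a shell in the Airflow container
cd <path-to-dbt-project>                             # move into the dbt project directory
dbt debug                                            # check the Snowflake connection configured in profiles.yml
dbt run                                              # run the dbt models against Snowflake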

Key Features:

Airflow orchestration with Docker
dbt data transformations in Snowflake
GitHub integration for version control

Requirements:

Snowflake account
GitHub repository
Docker & Visual Studio Code

Build a scalable, modern data pipeline by following this concise guide!

Run this command in your terminal to clone the project:
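
(The repository URL is not reproduced in this description; substitute the link shown in the video. A sketch:)

git clone <repository-url>     # clone the project locally
cd <repository-folder>         # move into the project before running docker-compose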

Meetup:

Calendar:


Hey everyone! I'd love to hear about the pipelines you're working on and any content you'd like to see. Wishing you all the best in your projects!

DataSolve-uz