Learn Together: Ingest data with Spark and Microsoft Fabric notebooks


Discover how to use Apache Spark and Python to ingest data into a Microsoft Fabric lakehouse. Fabric notebooks provide a scalable, repeatable way to do it.

---------------------

Learning objectives

- Ingest external data to Fabric lakehouses using Spark
- Configure external source authentication and optimization
- Load data into a lakehouse as files or as Delta tables

---------------------

Chapters
--------
00:00 - Introduction
05:49 - Learning objectives
06:22 - Context in Fabric
10:32 - What is Spark?
14:55 - An introduction to notebooks
16:43 - Explore Fabric notebooks
22:29 - What is a lakehouse?
25:45 - Fabric lakehouse vs Data warehouse
28:48 - Write data into a lakehouse
30:58 - Write to a Delta table
34:14 - Consider uses for ingested data
42:02 - Exercise - Ingest data with Spark and Microsoft Fabric notebooks
1:04:52 - Knowledge check
1:13:25 - Summary

---------------------

Presenters

Johan Ludvig Brattås
Director, Microsoft MVP
Deloitte

Heini Ilmarinen
Azure Lead, Microsoft MVP
Polar Squad

Moderators

Marcel Magalhães
Power BI Consultant, Microsoft Super User
G2M Consultoria e Treinamento


Olivier Van Steenlandt
Core Member, Microsoft MVP
---------------------

Comments

Tremendously impressed with the notebook experience in Fabric. Loved it.

XLSavvy

No support for recursion or geospatial data types.

adilmajeed

The lakehouse in Fabric has both Tables and Files. What are they? Are the Tables a data warehouse and the Files a lakehouse? Does the lakehouse have a data warehouse underneath?

juliekim