Microsoft Fabric: Load an on-premises data file to a Lakehouse using on-premises Python with an OAuth Token

In this Microsoft Fabric video, I will show how to load on-premises data files into a Microsoft Fabric Lakehouse using on-premises Python code with MFA. The Python code uses an OAuth2 token to load the file to the target.
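Below is a minimal sketch of the token step, assuming MSAL's device-code flow (which supports MFA); the helper in the video (for example get_access_token with app_id, client_secret, and directory_id, as seen in the comments below) may use a different flow, and the IDs here are placeholders.

# Sketch: acquire an Entra ID (Azure AD) access token for OneLake with MSAL.
# The device-code flow is used because it works with MFA; the video's own code
# may use a different flow. tenant_id and client_id are placeholders.
import msal

tenant_id = "<directory-id>"          # Entra ID tenant (directory) ID
client_id = "<app-registration-id>"   # app registration (client) ID

app = msal.PublicClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
)

# OneLake exposes an ADLS Gen2-compatible endpoint, so the Azure Storage scope is requested.
flow = app.initiate_device_flow(scopes=["https://storage.azure.com/.default"])
print(flow["message"])                # prints the code to enter at microsoft.com/devicelogin
result = app.acquire_token_by_device_flow(flow)

if "access_token" not in result:
    raise Exception("Error in getting in access token")
access_token = result["access_token"]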

Microsoft has released Microsoft Fabric, the next-generation analytics platform. Check out how to enable it in the Power BI Service and how to start a Fabric (Preview) trial.
The Microsoft Fabric platform provides customers with a SaaS-ified, open, lake-centric (OneLake), full-featured data, analytics, and AI platform that meets all their data estate needs. Power BI, Synapse Data Warehouse, Data Factory, Spark, and Notebooks are all under one platform.

00:00 What we plan to do
04:00 Azure Token Generation
07:30 Understand the Python Code
15:30 Load the Data in Microsoft Fabric Lakehouse
16:30 Load one more file
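For the "Load the Data in Microsoft Fabric Lakehouse" step, a hedged sketch of the upload is shown below. It assumes the azure-storage-file-datalake package and the documented OneLake DFS endpoint; the workspace, lakehouse, and file names are placeholders, and the exact code in the video may differ.

# Sketch: upload a local CSV into the Files area of a Fabric Lakehouse through
# OneLake's ADLS Gen2-compatible DFS endpoint. access_token is the token from
# the sketch above; workspace, lakehouse, and file names are placeholders.
import time
from azure.core.credentials import AccessToken
from azure.storage.filedatalake import DataLakeServiceClient

class StaticTokenCredential:
    """Minimal wrapper so the SDK can use an already-acquired OAuth2 token."""
    def __init__(self, token):
        self._token = AccessToken(token, int(time.time()) + 3600)  # assume ~1 hour validity
    def get_token(self, *scopes, **kwargs):
        return self._token

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",   # OneLake DFS endpoint
    credential=StaticTokenCredential(access_token),
)

# In OneLake the "file system" is the workspace, and the path starts with the
# lakehouse item, e.g. MyLakehouse.Lakehouse/Files/...
fs = service.get_file_system_client("MyWorkspace")
file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales/sales_2023.csv")

with open("sales_2023.csv", "rb") as f:
    file_client.upload_data(f, overwrite=True)   # replaces the file if it already exists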

🔵 What is Microsoft Fabric
Microsoft Fabric is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place.

With Fabric, you don't need to piece together different services from multiple vendors. Instead, you can enjoy a highly integrated, end-to-end, and easy-to-use product that is designed to simplify your analytics needs.

Don't forget to like, comment, and subscribe for more great content!
▶️Data:

---------
▶️Follow me on:
---------
▶️My Other Videos:

-~-~~-~~~-~~-~-
Please watch: "Microsoft Power BI Tutorial For Beginners✨ | Power BI Full Course 2023 | Learn Power BI"
-~-~~-~~~-~~-~-
Comments

Amit, thanks for this excellent tutorial. When I run the code with the appropriate items for my Fabric instance plugged in, I am getting the error "Error in getting in access token". I notice at around 6:27 in your video you have a bunch of other API permissions granted (e.g., Power BI Service - App.Read.All). Do I also need to add those to my API permissions? Just trying to figure out what I am doing wrong. Thanks for your assistance!

frankmroberts

Hi Amit, can we append the data to the OneLake CSV file instead of replacing the file itself? If yes, can you please let me know how?

harshavardhan
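One way to append rather than replace, sketched here under the assumption that the file sits under the Lakehouse Files area and the same azure-storage-file-datalake client from the upload sketch above is reused; the path and the sample rows are placeholders.

# Sketch: append rows to an existing CSV in OneLake instead of overwriting it.
# Reuses the fs client from the upload sketch; all names are placeholders.
file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales/sales_2023.csv")

new_rows = b"2023-09-01,WidgetA,42\n2023-09-01,WidgetB,17\n"   # no header row when appending

current_size = file_client.get_file_properties().size    # where the existing data ends
file_client.append_data(new_rows, offset=current_size)   # stage the new bytes
file_client.flush_data(current_size + len(new_rows))     # commit up to the new total length

For anything more involved (deduplicating rows or changing the schema), downloading the file, concatenating locally with pandas, and re-uploading with overwrite=True is usually simpler.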

Hi Amit, that was a great way to import data. One question: can we import many salespersons' files into the Lakehouse after concatenating or combining them? Please help with creating a pipeline, a dataflow, or a Python approach so we can import the raw data of many salespersons from different folders in my SharePoint.

subratkumar
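A rough sketch of the concatenation idea, assuming the salesperson files have already been downloaded or synced from SharePoint to a local folder (reading them straight from SharePoint would additionally need the Microsoft Graph API); folder, column, and file names are placeholders.

# Sketch: combine many salesperson CSVs into one file, then upload the result
# with the same OneLake fs client as in the sketches above. The files are
# assumed to share the same columns; all paths are placeholders.
from pathlib import Path
import pandas as pd

frames = []
for csv_path in Path("salespersons").rglob("*.csv"):   # one sub-folder per salesperson
    df = pd.read_csv(csv_path)
    df["source_file"] = csv_path.name                  # keep track of the origin file
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)
combined.to_csv("all_salespersons.csv", index=False)

file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales/all_salespersons.csv")
with open("all_salespersons.csv", "rb") as f:
    file_client.upload_data(f, overwrite=True)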

Hi Amit, do we need to enable "Grant admin consent" in the app registration? I am unable to see the option in the app registration. I am facing this error in the Python code:

Traceback (most recent call last):
  line 130, in <module>
    access_token = get_access_token(app_id=app_id, client_secret=client_secret, directory_id=directory_id, user_name="emailid", password="password")
  line 74, in get_access_token
    raise Exception("Error in getting in access token")
Exception: Error in getting in access token

getting this error

karrisaikumar

Hi Amit, The instructions were so helpful.

I wanted to move files and lakehouses from one workspace, which is associated with one mail ID, to another workspace, which is associated with a different mail ID. I don't want to give permission to a second account; instead, I want to move the entire workspace.

From your approach, I have thought of two steps to achieve it.
1. Use a modified version of the above Python approach to get the data from workspace 1 (I don't want to save it in local storage).
2. Use that data (or expose it as a function or API endpoint) to ingest it into workspace 2 with the above Python approach.

Is it possible to do it? I need guidance on how to get data from Fabric OneLake. Kindly share your thoughts. Thank you!

GokulakrishnanM-SC
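A hedged sketch of that two-step idea, assuming a separate token is acquired for each account (source_credential and target_credential below are credential wrappers like the one in the upload sketch) and the data is streamed through memory rather than saved locally; every name here is a placeholder.

# Sketch: copy a file from a lakehouse in workspace 1 to a lakehouse in
# workspace 2 without writing it to local storage. source_credential and
# target_credential wrap tokens acquired separately for each account.
from azure.storage.filedatalake import DataLakeServiceClient

onelake = "https://onelake.dfs.fabric.microsoft.com"

source = DataLakeServiceClient(account_url=onelake, credential=source_credential)
target = DataLakeServiceClient(account_url=onelake, credential=target_credential)

src_file = (source.get_file_system_client("Workspace1")
                  .get_file_client("LakehouseA.Lakehouse/Files/sales/sales_2023.csv"))
dst_file = (target.get_file_system_client("Workspace2")
                  .get_file_client("LakehouseB.Lakehouse/Files/sales/sales_2023.csv"))

data = src_file.download_file().readall()     # read the source file into memory
dst_file.upload_data(data, overwrite=True)    # write it into the second workspace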

Thank you 🙏 I don't know why, but for me it didn't work with the secret ID. I had to put the client secret Value instead, and then it worked.

faheemiftikhar