SQL Server in Microsoft Fabric? Use CETaS to move data!

People have been asking "What about SQL Server?" with regard to Microsoft Fabric. Bob Ward joins us to show how you can get your SQL data into Fabric with CETaS and the power of shortcuts in OneLake, and then use that data in Power BI!

CREATE EXTERNAL TABLE AS SELECT (CETAS) (Transact-SQL)

Microsoft SQL, Data Virtualization and Microsoft Fabric (Slides)
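To illustrate the linked CETAS topic: a minimal statement looks something like the sketch below. The external data source (`MyAdlsSource`), file format (`ParquetFF`), table, and paths are hypothetical placeholders, and a real setup also needs a database scoped credential.

```sql
-- Sketch only: MyAdlsSource and ParquetFF are assumed to already exist
-- (via CREATE EXTERNAL DATA SOURCE / CREATE EXTERNAL FILE FORMAT).
CREATE EXTERNAL TABLE dbo.Sales_External
WITH (
    LOCATION    = '/sales/2023/',   -- folder in the ADLS container
    DATA_SOURCE = MyAdlsSource,     -- hypothetical external data source
    FILE_FORMAT = ParquetFF         -- hypothetical Parquet file format
)
AS
SELECT SalesOrderID, OrderDate, TotalDue
FROM dbo.SalesOrderHeader
WHERE OrderDate >= '2023-01-01';
```

The exported Parquet files can then be surfaced in OneLake with a shortcut, as shown in the video.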

Bob Ward:



Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.



#MicrosoftFabric #SQLServer #GuyInACube
Comments

Using on-prem SQL Server as Power BI source data must be one of the most common use cases out there. Before Fabric, exposing SQL Server tables to Power BI was very straightforward using Power BI gateways.

It seems that now, with Fabric, we need to create SQL Server partitions, then provision blob storage in ADLS, then use CETaS to export each partition as a Parquet file on ADLS, then expose that to OneLake using shortcuts, then reformat the virtualized Parquet files in OneLake as a Delta table, then build the data model in Power BI, and only then can you create a Power BI report from a SQL Server table. Am I the only one who sees this as a convoluted, crazy, over-engineered process?

There are probably other ways, using Azure pipelines, to copy data from on-prem SQL Server to OneLake (still in preview, I believe). But even when that is GA, the concept of having to copy your on-prem SQL Server structured data to a blob storage persistence layer to do simple reporting looks twisted. Companies that want to embrace Fabric are going to have to re-engineer their ETL processes to write to OneLake, and deal with a heck of a one-time data load for the existing historical data. All of that just to reach what is probably a worse performance scenario than the one they already have.

Direct Lake connectivity might be the one saving grace in this whole thing... we'll have to see. But as of now I'm very sceptical, and a bit pissed at MS for their strategy of pushing everyone (eventually) to this OneLake architecture, which in many use cases is absolutely not required.

chainpeu

This was only the second of your videos I've watched. Wowsers, blown away: all that movement of data and zero code.
A+

kayespedal

Yoooo!!! Great content, guys. You're making my Fabric journey easy to navigate with your amazing content. I love the Fabric community.

Storm_Troooper

Would love to see a video from you guys showing some examples or best practices for getting data from different sources and databases into Power BI.
For example: we have a CRM running on MySQL, an ERP on Firebird, and a financial ERP on MSSQL. What is the best strategy for bringing the data together in order to analyse it in Power BI? And where: in Fabric OneLake, in an Azure SQL data warehouse, ...?

yanny

The Fabric and OneLake concept is cool.

ravindersunkara

Question: I can easily extract data from an .html financial stocks table using a PySpark notebook with a BeautifulSoup crawler, define the schema and column headers, create a dataframe, and save the result as Delta (Stocks.part01, Stocks.part02, ..., Stocks.part0n) in a Lakehouse subfolder under Files. (This could also be achieved by running a Python .py file as a scheduled Spark Job Definition.) Still, Delta doesn't offer many options other than read (Preview) at this stage, so I guess we have little choice but to create a SQL table in the Warehouse using CETAS. The sample shows a single file, but what if we need to, sort of, "concatenate" all the Delta parts (Stocks.part01, Stocks.part02, ..., Stocks.part0n) into one table from the same Delta location?

denyschamberland
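On the question above about "concatenating" the Delta part files: one possible approach, sketched under the assumption that an external table's LOCATION can point at the folder (with a wildcard, where the engine supports it) rather than a single file, so all the part files are read as one table. `MyLakehouseSource`, `ParquetFF`, and the column list are hypothetical placeholders.

```sql
-- Sketch only: reads every Parquet part file under /Files/stocks/ as one table.
-- MyLakehouseSource and ParquetFF are assumed to already exist.
CREATE EXTERNAL TABLE dbo.Stocks (
    Symbol     VARCHAR(10),
    TradeDate  DATE,
    ClosePrice DECIMAL(18, 4)
)
WITH (
    LOCATION    = '/Files/stocks/**',  -- folder/wildcard, not a single file
    DATA_SOURCE = MyLakehouseSource,   -- hypothetical external data source
    FILE_FORMAT = ParquetFF            -- hypothetical Parquet file format
);
```

Querying `dbo.Stocks` should then return the union of the rows in all matching part files, since each part shares the same schema.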

Things are getting very exciting! Is there a way to stream live SQL Server production data into Fabric?
We need to report on live data but we don't want to drop and recreate the data sources... what's the best option for streaming live data into Fabric?

gpltaylor

Any word on whether we will have the ability to see usage metrics when moving items from SQL into a Fabric Lakehouse or Warehouse, similar to how we can currently use the Monitoring features in Azure to see Azure SQL Server space and DTU usage during read/write operations? If I point a Power BI report/dataset at the Lakehouse SQL endpoint and set an hourly refresh rate on it, it would be helpful to compare that read/write usage to the comparable Azure SQL metrics, to see what makes sense cost-wise for different scenarios.

clinthofeldt

Important question... are people seriously calling it "The Fabric"?

JasonHurley

If it ain't broke, don't fix it. This is just ETL with convoluted extra steps 😅
I am really starting to miss Blob Storage, Gen2 storage, and a foolproof SQL database...

If the data lake filesystem were a GUI, fine. But in the cases I've seen, you need a Scala/Python framework to handle Parquet ingestion/conversion. Oh, and needing an Azure URL string to query the data is no fun 😭

dagg

Am I right in understanding that you cannot create shortcuts in a MS Fabric lakehouse to directly access PaaS SQL Server tables, so you need a complex workaround to make it happen?
What is the story behind this limitation? Is a more straightforward solution on its way?

akthar

Got a specific question and don't know how to navigate it. I have an Azure SQL Managed Instance in the same resource group as my Fabric capacity, yet I cannot for the life of me get Fabric Data Factory to connect to it. I can connect to the instance remotely with SSMS. Any idea what to troubleshoot? It looks networking-related somehow.

sonicdoomofficial

"I didn't write a single line of code"- as he shows approximately 60 lines of SQL Scripting.

I think this no-code thing is not quite as ready for primetime as Microsoft wants us to believe.

TedSeeber

Now how do I convince my IT team, who controls all this but doesn't use any of these tools, that we need this by the end of the week?

kc_cal

Is there any way to do this prior to SQL Server 2022?

zorgath

I wonder what the SQL DB compatibility level needs to be to achieve this.

mnhworkdev

Just waiting for the perfect world where Delta performs as well as, or better than, an indexed relational data warehouse.
I'm skeptical of Microsoft's use of Delta in Fabric Warehouse, and I hope it improves, because the performance has been poor. It seems like Synapse Serverless.

joshash

I want to ask: just as Fabric can show a Power BI dashboard report in one click, can it also produce SQL results in one click? For example, total sales, total orders, max sales, etc. If these and other SQL queries could be answered in one click in Fabric, that would be great, because then we wouldn't need to learn SQL. Please answer my question.

Khan-mcgi

How long until an Azure SQL table can exist as a Fabric table? (The lines are becoming blurred.)

robbief

The same thing we already have in Synapse Analytics, but now we can use a new word... shortcut...

vincenzobuono