What does Microsoft Fabric mean for Power BI?

You've heard the news about Microsoft Fabric, but what about Power BI? How does Power BI fit in? Do you need to migrate anything? Can you still do what you were doing? Adam tells you what you need to know!

Direct Lake (PREVIEW)

Microsoft Fabric Build 2023 Day 1

Microsoft Fabric Build 2023 Day 2


*******************

Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.



#PowerBI #MicrosoftFabric #GuyInACube
Comments

Adam is surprisingly understandable at 1.75x speed. Damn you, Microsoft, for releasing so many features that I need to watch everything at 1.75x speed to keep up!

stevefox

Great video! One thing to keep in mind on the Fabric transition: some server locations, including major ones like US Central, are currently not supported for Fabric capabilities in the tenant or capacity. MSFT support has stated that those are likely coming onboard by July 2023, but just FYI for those wondering why you can't access all of the new capabilities even with a Premium capacity and all the checkboxes checked off!

ZachCunninghamZC

Awesome... very well explained, with great clarity, while we continue to work with Power BI and its new capabilities. Thanks a lot 👏

prasadparab

Awesome job with this video, Adam! Thank you.

RushiLadani

Thank you for the great work as usual. One thing I am struggling to understand is whether we will get Copilot even if Fabric is not enabled (I am not the tenant admin). Thanks in advance.

nicolasbelleville

Great video... I love your content.
New idea for a video: we are hitting the 25GB large-dataset limit in our Power BI Premium P1 capacity during refresh for a couple of big dashboards. Before going from a P1 to a P2 (which provides a 50GB limit per dataset), can you guys prepare a video on possible solutions until corporate can purchase the next capacity? Maybe splitting the large dataset into 4-5 smaller source datasets using a dataflow, and then combining them into one dataset; is that possible? (I read somewhere the top limit was 100GB.) Really appreciate it...

andresvideo

Howdy Guys in Cubes, my biggest question about Fabric is how it is going to impact the Microsoft certifications for Power BI. Are we going to see certification changes like we did with the DA-100 to the PL-300? Cheers!

tofonofo

Finally, Git integration for Power BI. Interested to see how that will work with CI/CD pipelines across all of Fabric, and how workspaces should be structured to support that.

mrdavemckay

Thanks! How does it affect customers who are using Power BI strictly for on-prem data? How can such customers leverage Fabric?

SSatish

Do you have instructions on how I can connect my information that is in Microsoft Planner to Microsoft Project? Thanks a million!

lashonalovelyrodgers

But Fabric is in a preview state now. So if we are starting to use Power BI in our organization, should we use Power BI standalone or as part of Fabric, considering that Fabric doesn't come with normal support right now because of its preview state?

tol

Hey Adam, thank you for answering that question about Power BI and Fabric, but it raises another question in my mind. I always thought that Power BI was part of the Power Platform, but now it sounds like it's part of Fabric. (My head is exploding already!) What is the connection between the Power Platform and Fabric?

willbee

You could point out what the future of datamarts is. Are they not really needed anymore, now that lakehouses and warehouses can completely replace them? Thoughts on that? Thanks

DanielWeikert

I am an administrator in my organisation and I have enabled Fabric for the entire organisation. It seems like it worked, but not totally: now when I reload Power BI I see the Fabric logo instead of the Power BI one, and the OneLake hub is visible too, but I don't see the icon at the bottom left where you can switch modes. Can you please tell me how to fix this? I also have a Power BI Pro license, and I am located in Germany.

abdosobhy

Thanks for the video, it's helpful. I have one case though: is it possible for a user with a Pro license who is onboarded to the Power BI platform to see all the datasets under a selected domain in the OneLake Data hub without having access to the workspaces? E.g., if the user selects the HR domain, all the datasets under that domain are listed, the user can see each dataset's description and whether it is endorsed or certified, and based on that information the user raises an access request that flows to the owner of the dataset. Essentially, I'm trying to give users an overview of all available datasets without access to them, so that they can then request access to the datasets they need. Is this something that's possible in the OneLake data hub? Appreciate your response.

vineetpatil

Guys, a question that I can't seem to find an answer to:
with Premium capacity, when you switch on Fabric, do all your existing workspaces move to OneLake, or is that a separate capacity? Many thanks for any clarity.

philmeiklejohn

Very informative! So if I currently have a Pro license and only coworkers with a Pro license can see my Power BI reports... if I upgrade to the full Fabric license, can anyone in my company see these reports (say, on a SharePoint site) even if they don't have Pro? Struggling to find this answer! Thanks

richardobrien

Excited to see it in action once my enterprise brings it online.

RonDexter

If I have some reporting needs on a SQL database, should I look at Microsoft Fabric, or just Power BI?

georgelbenko

My company has some data in Azure SQL that refreshes once every 24 hours.
I can write a dataflow to bring selected rows into Power BI.
However, I want to create some reusable summary tables, for which I think Python would be great.

Is it possible that, in a Lakehouse:
1 - I create a Dataflow that brings data from Azure SQL into a table
2 - I create a scheduled pipeline that runs every 24 hours and runs the Dataflow to overwrite my table
3 - I use PySpark or something similar to create the summary table
4 - I write that table to the Lakehouse

I am not sure how steps 3 and 4 would be automated on a schedule, and I am not sure if the above is possible at all.

Can you please help?

tinjxqn
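
One way to automate steps 3 and 4 in the question above is to put the PySpark logic in a Fabric notebook and add that notebook as an activity after the Dataflow activity in the same scheduled pipeline, so the whole chain runs on the one 24-hour trigger. Below is a minimal sketch of what such a notebook cell might contain; the table name "sales", the columns "region", "order_date", and "amount", and the output name "sales_daily_summary" are all hypothetical placeholders, not anything from the video.

# Hypothetical Fabric notebook cell covering steps 3 and 4.
# "spark" is the Spark session a Fabric notebook provides by default.
from pyspark.sql import functions as F

# Step 3: read the Lakehouse table the Dataflow loaded, then aggregate it.
sales = spark.read.table("sales")

summary = (
    sales
    .groupBy("region", F.to_date("order_date").alias("order_day"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Step 4: write the summary back to the Lakehouse as a table;
# overwrite mode means each scheduled run replaces the previous summary.
summary.write.mode("overwrite").saveAsTable("sales_daily_summary")

Scheduling then lives entirely in the pipeline: one trigger runs the Dataflow activity first and the notebook activity second, so the notebook needs no separate schedule of its own.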