Analyzing a slow report query in DAX Studio

Comments

Thanks Marco for explaining such a critical topic in such simple terms!

nehachak

Thanks for sharing. Power BI is such a clunky assembly of tools; I pine for the simplicity and efficiency of Tableau.

SteveFirefly

Absolutely fantastic! I can't tell you how much this video helped me optimize some measures I had been struggling to make faster!

evnpresson

As usual, you are a genius! I had a crazy query that someone else wrote and the visual was timing out; I updated it to use KEEPFILTERS and it returns the same results but actually runs now :). Thank you!

rickcoterie

Excellent explanation, Marco. This video should be shown as a preamble at every #PowerBI event, training course, webinar, etc.

Fernando_Calero

Wow, amazing, really. You are a genius; I never thought that by using DAX Studio we could optimize our queries and improve performance. This was a whole new way to use DAX Studio for me. Thank you for sharing your extensive knowledge of Power BI and DAX.

vijayvizzu

Thank you so much, this improved my visual load times by an unbelievable amount.

josbar

Love these bite-size chunks of DAX learning, thanks!

csterling

Thanks for sharing great insights into advanced query optimization techniques!

mintheinaung

Great one, Marco.
Yes, I did understand that a FILTER over an entire table hurts performance, but I didn't understand the replacement with the KEEPFILTERS function.
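The replacement pattern discussed in the video can be sketched roughly like this; the Sales/Product model and the [Sales Amount] measure here are hypothetical placeholders, not taken from the video:

```dax
-- Slower: FILTER iterates the whole table as a filter argument
Sales Amount Red :=
CALCULATE (
    [Sales Amount],
    FILTER ( Sales, RELATED ( 'Product'[Color] ) = "Red" )
)

-- Faster: filter a single column; KEEPFILTERS intersects the new
-- condition with the existing filter context instead of replacing it
Sales Amount Red :=
CALCULATE (
    [Sales Amount],
    KEEPFILTERS ( 'Product'[Color] = "Red" )
)
```

The key difference is that the second version gives the storage engine a simple column predicate to push down, while the first forces an iteration over the table.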

DIGITAL_COOKING

Thank you for a great video. Could you please tell me where to find the list of all best practices combined? For example, I use the DISTINCTCOUNT ( ) function, but in one of your videos I learned that it is a very heavy function for Power BI to execute; better to use SUMX ( VALUES ( ____ ), 1 ) (depending on the context of usage).
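For context, the two formulations mentioned in this comment look like the following; the Sales[CustomerKey] column is a hypothetical example, and which version is faster depends on the model and query shape:

```dax
-- Direct distinct count; can be expensive on high-cardinality columns
Customers := DISTINCTCOUNT ( Sales[CustomerKey] )

-- Equivalent result: VALUES returns the distinct visible values,
-- and SUMX adds 1 for each of them
Customers Alt := SUMX ( VALUES ( Sales[CustomerKey] ), 1 )
```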

galymzhankenesbekov

Amazing! Your videos are awesome! Thanks for sharing!

DanielADamico

Hello Marco! Thanks for sharing this amazing topic! Following your steps, I got stuck when I tried running Query Plan and Server Timings: I get a message stating "Timeout exceeded attempting to start." I increased the timeout options, but I am still getting the same error. Can you kindly tell me what is going on?

luisfernandomacedo

Great!!! Thanks a lot. What if FE time is much, much bigger than SE time? Where might the problem be?
Thanks

nachomiranda

Can you provide a link to the CALCULATE best practice you mentioned in the video at 8:37?

nimaiahluwalia

Can you show a version analyzing a measure from Power Pivot? I have a couple of measures that depend on filter context. The first thing that is unclear to me is how you can get the DAX query from a measure in Power Pivot. The second is how to analyze such a measure in DAX Studio when its calculation depends on the filter context. I probably missed something, but that is why I have to ask. Every time you want to analyze something, you take a copy of the DAX query from Power BI, but is that possible from Power Pivot?

filecile

8:36 how do you know when a column is a filter column?

craniumnr

Hello Marco, thank you for the video. I'm struggling with Analysis Services because I have a server with 6 cores, but my Tabular instance constantly uses only one core.
Is there a way to enable multi-core processing or something? Thanks

davideperina

I have a different issue that does not seem solvable by using DAX Studio.

SSAS model:

Dim Customer has 18 million rows (nothing I can do to fix that, as it is used in multiple fact tables).

Column in Dim Customer: STATUS (Alive, Deceased).

Fact table 1 has 4 million rows for this report.

Relationship: Fact[Customer ID] to Dim Customer[Customer ID].

On a PBI report, if I use the flyout filter box and set the filter to ALIVE, it dramatically slows the output down to 30 secs.

But it is not a measure, so I can't rewrite it.

I know just from using the report that any time I add a filter for status it slows down, so I already have a head start on the issue.

If I copy the DAX from the visual, put it in DAX Studio, and go to Server Timings, I see:

SET DC_KIND="AUTO";
SELECT
    'dim_Customer'[Ent_ID],
    'dim_Customer'[Age Current]
FROM 'dim_Customer'
WHERE
    'dim_Customer'[Status] = 'Alive';

It is written behind the scenes as:

VAR __DS0FilterTable4 =
    TREATAS ( { "Alive" }, 'dim_Customer'[Status] )

Estimated size: rows = 18,395,979; bytes = 73,583,916
CPU: 17,344

The only things I know would intuitively work:

Option 1: put STATUS into the FACT table. I don't want to go this route.

Option 2: have multiple Customer dimensions, one per FACT table, which would decrease the scan for this particular fact table. This just adds to model complexity and overall size.

zxccxz

Hello Marco, I am struggling to improve the performance of a measure built on top of a snapshot fact table containing 2 billion rows. I want to sum a value but only for a single day (the last day of the time context). The formula looks like this:

CALCULATE (
    SUM ( 'MyTable'[volume] ),
    FILTER (
        'MyTable',
        'MyTable'[Date] = MAX ( 'MyTable'[Snapshot Date] )
    )
)

Based on the example from this video, is there a way I could rewrite the measure to filter only on a column rather than the entire table?
Thanks!
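Applying the video's pattern to this measure, one possible sketch is the following; it assumes the comparison is meant to be against 'MyTable'[Snapshot Date] (the comment mixes [Date] and [Snapshot Date]), and it is a hypothesis to test with Server Timings, not a guaranteed fix:

```dax
-- Compute the target date once, then filter only the column,
-- so the 2-billion-row table is not iterated as a filter argument
Last Day Volume :=
VAR LastDate = MAX ( 'MyTable'[Snapshot Date] )
RETURN
    CALCULATE (
        SUM ( 'MyTable'[volume] ),
        'MyTable'[Snapshot Date] = LastDate
    )
```

Because the variable is evaluated before CALCULATE changes the filter context, the predicate becomes a simple column filter the storage engine can push down.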

Ionesta