3 Ways to Reduce Data in Power BI Desktop (No Premium Needed!)

Sometimes you want to reduce the data in Power BI Desktop so it's easier to work with, then work with the full dataset in the Power BI Service. Patrick shows you 3 ways to do this without using Incremental Refresh or needing Power BI Premium!
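The row-limit technique from the video can be sketched in Power Query M. This is a minimal sketch, not the video's exact code; the server, database, table, and `RowLimit` names are assumptions (`RowLimit` would be defined under Manage Parameters, set small in Desktop and large, or removed, once published):

```m
let
    // Assumed source names; replace with your own server and database.
    Source = Sql.Database("myserver", "AdventureWorks"),
    Sales = Source{[Schema = "Sales", Item = "Orders"]}[Data],
    // RowLimit is a Power Query parameter; against a SQL source this
    // "keep top rows" step can fold into a TOP n query on the server.
    Limited = Table.FirstN(Sales, RowLimit)
in
    Limited
```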


*******************

Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.


#PowerBI #ReduceData #GuyInACube
Comments

Your videos have really helped me grow my Power BI practice in my organisation. Thank you so much 😀

hatimali

I have been using date parameters for a proof of concept for my client and it works very well. It allowed me to avoid technical limitations of the relational database in use at the organization. Once in the service, all the data was pulled. I didn't know there was a simple way to limit the number of rows directly via Power Query. I'll test that now. Thank you for the useful information.

dinoben

Thanks so much... I spent so much time working in the editor making small changes and waiting for the changes to apply on 6MM rows... cheers

SlothyUK

Thank you so much Patrick😀 This is really helpful! Would love to see videos on customer attrition analysis. Cheers👍

apekshaawaji

Another amazing video! Great job Patrick! I always wondered how models with large data sets were created and published without a powerful machine.

vpwl

Hi Patrick, thank you so much. I was looking for this information before.

miguelguillenpaz

Hi Patrick, great video as always. I totally forgot about query folding, as I'm usually writing custom SQL queries that Power BI can't fold anyway... so I'm usually injecting my parameter into the SQL query (the WHERE clause) directly. Unfortunately, this approach makes the SQL query hard for a human to read (it adds lots of code like #(lf) instead of new lines, etc.). Thanks for bringing up the possibility of folding our queries in an intuitive way.

otakarveleba
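One way to keep an injected parameter readable, as an alternative to the #(lf)-laden strings the commenter describes, is to build the native query from a list of lines. A hedged sketch with assumed server, table, and parameter names (`StartDate` would be a Power Query parameter, passed as the named SQL parameter `@Start`):

```m
let
    // Assumed source; replace with your own server and database.
    Source = Sql.Database("myserver", "AdventureWorks"),
    // Build the SQL from separate lines, joined with a newline, so the
    // query text stays readable in the Advanced Editor.
    QueryText = Text.Combine({
        "SELECT OrderID, OrderDate, Amount",
        "FROM Sales.Orders",
        "WHERE OrderDate >= @Start"
    }, "#(lf)"),
    // The record maps named SQL parameters to Power Query values.
    Result = Value.NativeQuery(Source, QueryText, [Start = StartDate])
in
    Result
```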

I was looking for this trick for a long time. Hours and hours wasted waiting for refreshes. Thank you!

ayouberrarhbi

I was using exactly this approach (parameter), I'm glad I'm not the only one.

TitusRex

Really useful tips that will help people working from home over slow broadband through a VPN develop and deploy reports with less time spent waiting for rows to load. Thanks Patrick 😊

sands

If only I was allowed to use the service!!! So many things would be possible. Great content and I will keep it in mind if I ever get them to change their minds.

pabeader

I love the fact that you can select which table to load, i.e. the filtered one or the unfiltered one. I am using this feature for testing, cutting the load down to fewer rows.

intelabhinav

Thanks for the great idea (the 2nd way, with the parameter)!
I can't use it as-is because Report Server (on-premises) doesn't have a parameter option for scheduled refresh. But (as always) an xlsx file with the number of rows, sitting in a network folder, works well :)
Following Matthew's "upstream" maxim, I included the number of rows from the xlsx file in the SELECT section of the query. Now I know that "first 100 rows" translates differently for an MS SQL source and an Oracle source. Interesting experience!

KateMikhailova
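The dialect difference the commenter ran into can be sketched as M building two native-query strings; the table names are assumptions, and older Oracle versions would use a ROWNUM filter instead of the 12c+ FETCH FIRST clause shown here:

```m
let
    // Assumed: the limit is read from an xlsx file elsewhere in the query.
    RowLimit = 100,
    // SQL Server limits rows with TOP ...
    SqlServerQuery =
        "SELECT TOP " & Number.ToText(RowLimit) & " * FROM dbo.Sales",
    // ... while Oracle 12c+ uses FETCH FIRST n ROWS ONLY.
    OracleQuery =
        "SELECT * FROM SALES FETCH FIRST " & Number.ToText(RowLimit) & " ROWS ONLY"
in
    SqlServerQuery
```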

thank you for this - I'm going to try and set up some refresh parameters now on the report I'm building because it links to Salesforce but I only need to see one of the territories/divisions so hopefully, hopefully, this will help reduce the thousands of lines it pulls through!

pinstripeowl

Another easy way to reduce dataset size without reducing number of rows is to convert decimal values to whole numbers (assuming you don't need the additional accuracy)

samiphoenix
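The decimal-to-whole-number tip above can be sketched as a single type-conversion step; `Sales` and `Quantity` are assumed names, and note the conversion rounds away the fractional part, so only use it where that accuracy isn't needed:

```m
let
    // Converting a decimal column to a 64-bit integer shrinks the column's
    // storage in the model; values are rounded in the process.
    AsWhole = Table.TransformColumnTypes(Sales, {{"Quantity", Int64.Type}})
in
    AsWhole
```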

Hello Patrick. Interesting stuff. Do you have any advice on optimizing performance between Power BI Desktop and AWS Athena using ODBC connection? Would the reduce rows option you showed be a good option in case of 150+ million rows? Appreciate your feedback.

aliramadan

Done something similar with the parameter values. Instead, we set up two dataflow entities to work off of: a full set of the data and a sample set of the data, named Source & Source_Sample.

Then we can just parameterize the source. So start with the parameter pointing to Source_Sample; when it's all done and published, update the parameter to Source. I like this approach, as it standardizes the process for all report builders in our company and we can ensure they're getting a good distribution of the data within the sample. Report writers just toss a couple of standard lines into the Advanced Editor (copy/paste) and they're all set up. We don't need to rely on them finding janky ways to build their own sample set.

tymccurdy
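The sample/full switch the commenter describes could look roughly like this. The navigation step is schematic (the exact record fields depend on how `PowerPlatform.Dataflows` exposes your workspace), and `SourceEntity` is a text parameter assumed to hold either "Source" or "Source_Sample":

```m
let
    // Schematic navigation down to the dataflow's entities; the real steps
    // are generated when you connect to the dataflow.
    Entities = ...,
    // One parameter flips every report between the sampled and full entity:
    // SourceEntity = "Source_Sample" while developing, "Source" once published.
    Data = Entities{[entity = SourceEntity]}[Data]
in
    Data
```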

That is some great advice - thank you.

I was also wondering about the third way, using start and end dates: are you able to set the values dynamically, i.e. the current date for the end date and, say, the date 3 months before it for the start date?

This is so that once you are done uploading the report and have set the schedule, the start and end date parameters in the Power BI Service would always use the current window (which will be different every day).

getusama
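One answer to the rolling-window question above: parameter values themselves stay static between refreshes, but the filter step can compute its dates at refresh time instead of reading them from parameters. A hedged sketch with assumed table and column names:

```m
let
    // Evaluated at each refresh, so the window rolls forward automatically.
    EndDate = Date.From(DateTime.LocalNow()),
    StartDate = Date.AddMonths(EndDate, -3),
    // Assumed names: Sales table with an OrderDate date column.
    Filtered = Table.SelectRows(Sales,
        each [OrderDate] >= StartDate and [OrderDate] <= EndDate)
in
    Filtered
```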

This channel is making Power BI no. 1 in the world.

premjitchowdhury

Hi Patrick, I need a solution for monitoring scheduled refreshes across multiple workspaces. Can you please make a video on this?

naveenkumarm.r