02 April 2019 Stefan Bekker

Power BI Userday Saturday 30th March


Every year the Power BI community in Holland organizes an event at the Jaarbeurs in Utrecht. The first edition was last year, and this event will be part of the calendar for the next couple of years as well. Since I was also part of the volunteer team last year, I already knew a bit of what to expect from this day.

Volunteer team

It is amazing to see so many people participating in this event on a Saturday. 650 people registered and about 600 eventually made it, which is 200 more than last year’s event. Now on to the event itself. As always it started with the keynote, where some Power BI product managers showed key features they are currently working on. These features will become available to everyone within the next couple of months.

Keynote

Keynote speaker

Reza Rad, an MVP who also wrote a 1,200-page book about Power BI (From Rookie to Rockstar), showed us how to benefit from personal bookmarks and talked about the growth of Microsoft Power BI over the last couple of years. After this, they also showed a great new addition: PowerApps integrated in Power BI. You can use a PowerApp embedded in your Power BI report to update, for example, a table in a SQL Server database that your report reads through DirectQuery. After you approve the new row with the check mark in your PowerApp, your Power BI report updates live.

This is an amazing feature for use cases such as updating records in your source system to get your data quality to 100%. On the other hand, you have to be careful with it: only the right people should be able to use this new feature.

After this great feature they explained how the so-called Common Data Model works. This goes hand in hand with dataflows in Power BI. A very big team is working on this now, and they showed some great AI features on top of dataflows. You can use a lot of Azure features on top of dataflows, for example Cognitive Services to predict customer satisfaction from nothing more than a text column in your dataflow.

I see a lot of benefits in dataflows. I am hoping that Microsoft changes the licensing model for Pro/Premium features here, since many of the cool options I saw this day require Premium licensing.

Since I was a volunteer, I had to guard some sessions, but unlike last year I only had to guard one session out of four, so I was quite happy to be able to choose which sessions to attend next.

Building enterprise models

For my next session I had to choose between ‘Dataflows’ and ‘Building enterprise models’, presented by Kasper de Jonge, who in my experience always delivers up to expectations. So my choice was to go for Kasper. The key features he showed here were Calculation Groups, Aggregations and Incremental Refresh. If you have very large data models (we are talking about terabytes or even petabytes of data) you have to do some smart things in your models to keep the performance alive and kicking. With incremental refresh you make sure that only new data gets loaded into the model, instead of reloading the whole model over and over again. He explained how you can tweak the details of the query that Power BI uses for incremental refresh to optimise your refresh times.
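
To give an idea of what that query looks like, here is a minimal Power Query (M) sketch of the incremental refresh pattern. The server, database, table and column names are made up for the example; RangeStart and RangeEnd are the two datetime parameters Power BI expects you to define for incremental refresh.

// Minimal sketch, assuming two datetime parameters named RangeStart and
// RangeEnd have been defined in the model (required for incremental refresh).
let
    Source   = Sql.Database("myserver.database.windows.net", "SalesDb"),   // hypothetical server and database
    Sales    = Source{[Schema = "dbo", Item = "FactSales"]}[Data],         // hypothetical fact table
    // Filter the datetime column on the two parameters; at refresh time Power BI
    // fills in the partition boundaries so only new data is loaded.
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered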

After this he showed some things that were new to me. With Manage Aggregations you can create a separate, pre-aggregated table on top of your biggest data table and build measures against that instead. This second table holds far fewer rows, but it is based on the same underlying data, so the calculations in these measures are still 100% correct. This is another great new feature they have implemented, so if we have performance issues this is a key thing to look at.
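
As a rough illustration of the idea (not the exact steps Kasper showed): such a pre-aggregated table is essentially a group-by over the detail table. A small Power Query (M) sketch, assuming an existing query called Sales with DateKey, ProductKey and SalesAmount columns:

// Conceptual sketch: build a much smaller aggregation table from the Sales
// detail query (hypothetical), keeping one row per date and product.
let
    Source   = Sales,
    SalesAgg = Table.Group(
        Source,
        {"DateKey", "ProductKey"},                                    // grain of the aggregation table
        {{"SalesAmount", each List.Sum([SalesAmount]), type number}}  // same totals, far fewer rows
    )
in
    SalesAgg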

The final thing he presented was calculation groups. With calculation groups you no longer have to write out every variation of every measure in your Power BI report. For example, you may have lots of YTD, MTD and previous-year calculations. If you use calculation groups you can just drag one measure into your matrix visual and the user can then select the time intelligence variant they want. Next he showed some new measures that will soon be available for all of us to use.

Dataflows Session

After this session I went to the Dataflows session of Reza Rad. He explained every little detail about the Common Data Model in combination with dataflows, and he provided four different use cases where dataflows are the right solution. I think this will be a big tool to use in the future, but it won’t replace data warehouses where customers already have one. The big benefit of dataflows is that if you don’t have a data warehouse yet, you can reduce your ETL time (which usually takes up 80% of a data analyst’s time) to, let’s say, 50-60%, because you don’t need a database or data warehouse. Dataflows use Azure Data Lake to store their data, so you actually query your data from there. Another benefit of dataflows is that you can directly access other Azure tools such as Machine Learning and Cognitive Services to get more value out of your data and take direct action.

Future data model

After lunch I had to guard a session, so in this slot I couldn’t pick the one I wanted to attend. But I was pleasantly surprised that I got to attend the one from Tony, since he was going to show us how to handle the difficulties of making an Actuals vs Budget report. He showed how customers often send us the budget as an Excel file in which the granularity differs from the actuals stored in the database.

Actuals are often stored in rows, whereas customers deliver the budget file in columns (Jan to Dec as separate columns containing the budget). He showed how you can unpivot this data in Power BI and combine it with your actuals by merging/appending in Power Query. The nice thing about this session was that he explained everything from the perspective of the customer, so with baby steps. This is a great reminder for us, because a lot of the time we explain things so fast and so technically to our customers that we simply lose them on the way to the end result. So well done Tony, I will definitely keep this in mind in my next presentations and trainings.
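
For reference, here is a minimal Power Query (M) sketch of that unpivot step; the file path, sheet name and column names are made up for the example.

// Minimal sketch: reshape a budget sheet with Jan..Dec columns into rows so it
// matches the grain of the actuals table. All names here are hypothetical.
let
    Source    = Excel.Workbook(File.Contents("C:\Budget\Budget2019.xlsx"), null, true),
    Budget    = Source{[Item = "Budget", Kind = "Sheet"]}[Data],
    Promoted  = Table.PromoteHeaders(Budget, [PromoteAllScalars = true]),
    // Keep the Account column fixed and turn the twelve month columns into rows.
    Unpivoted = Table.UnpivotOtherColumns(Promoted, {"Account"}, "Month", "Budget")
in
    Unpivoted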

Last session

The last session I wanted to attend was about Continuous Delivery in Power BI, but unfortunately I had to guard the registration room / partner room. Every downside has its upside: this way I could talk to the other volunteers and partners. After the last sessions there were drinks, and we had dinner with some volunteers, speakers and people from the Power BI organisation.

Thank you all for an amazing day. It is great to see so many passionate people who give up their free Saturday just to share their knowledge and know-how!