Performance problem in Power BI: is an SSAS cube a solution?

Hanae TAHRI 6 Reputation points

I am currently working on a project to build dashboards in Power BI. The model is simple: a single large table imported into Power BI that contains all the main indicators and analysis axes; other measures are created in Power BI. There is also a second table of smaller volume.
I am starting to run into performance problems, especially in Power BI Desktop, given the size of the table (around 17 million rows): memory pressure, latency, and so on.
• Can a cube with a live connection, instead of import mode, solve this performance problem?
• What are the best practices?

Thanks for your help

SQL Server Analysis Services

2 answers

  1. CarrinWu-MSFT 6,806 Reputation points

    Hi @Hanae TAHRI ,

    Thanks for your post!

    Your question is related to Power BI, so I suggest posting it in the Microsoft Power BI Community, where it can get answered more quickly. Thank you.

    In addition, from the SSAS side, it is possible to improve Power BI performance via SSAS. For more information, please refer to the link below:
    Microsoft Power BI: Tabular Model Optimization

    If a model becomes too large and query duration rises as the model grows, it can be worth adding an aggregation table. Such a table is aggregated before it is loaded into the tabular database and gives a higher-level view of the data. Reporting becomes faster because much of the calculation is done beforehand; the downside is that model processing takes longer.
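    The idea behind an aggregation table can be sketched with pandas (the table and column names here are purely illustrative, not taken from the original question):

```python
import pandas as pd

# Hypothetical transaction-level detail, standing in for a large fact table.
detail = pd.DataFrame({
    "order_date": ["2023-01-05", "2023-01-05", "2023-02-10", "2023-02-11"],
    "product": ["A", "B", "A", "B"],
    "amount": [100.0, 50.0, 75.0, 25.0],
})
detail["month"] = pd.to_datetime(detail["order_date"]).dt.to_period("M").astype(str)

# The aggregation table summarizes to month x product before the model loads it,
# so high-level reports scan a few pre-computed rows instead of every transaction.
agg = (detail.groupby(["month", "product"], as_index=False)
             .agg(total_amount=("amount", "sum"), order_count=("amount", "size")))
print(agg)
```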

    The same applies to calculated tables and columns: they increase model size and lengthen refresh times. If possible, it is always best to limit the use of calculated objects and to create/calculate them in the data source instead. Even if the calculation of such a table or column happens in a view, it only needs to run when the model is processed.
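    Pushing a calculation into a source-side view can be sketched like this (SQLite via Python is used purely for illustration; the table, view, and column names are made up):

```python
import sqlite3

# In-memory database standing in for the real data source.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (quantity INTEGER, unit_price REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", [(2, 10.0), (3, 5.0)])

# The model imports from this view, so line_total is computed in the source
# at refresh time rather than stored as a calculated column in the model.
con.execute("""
    CREATE VIEW v_sales AS
    SELECT quantity, unit_price, quantity * unit_price AS line_total
    FROM sales
""")
rows = con.execute("SELECT line_total FROM v_sales").fetchall()
print(rows)
```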

    Keep your model slim
    Remove unnecessary tables and columns; they take up resources and increase processing time. It is better to add them later if needed, since adding something is fairly easy.

    Multi-model design
    Avoid using only one model when multiple models make more sense. This allows you to optimize each model for its specific purpose: optimizing one model for recent detailed data and another for a global high-level overview will perform much better than trying to fit everything into one model.

    Properly set meta-data properties
    Change the “Summarize By” property for all fields that do not need to be summarized. A field like Year is automatically recognized as numeric, so Power BI will summarize it, which is annoying for users and has a serious negative impact on performance when data is shown at a low level of granularity. For big models it is advisable to avoid implicit “Summarize By” measures and to always create explicitly defined measures.
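    A toy illustration of why implicit summarization of a field like Year misleads (the data and column names are invented for this sketch):

```python
import pandas as pd

df = pd.DataFrame({"year": [2022, 2022, 2023], "sales": [10, 20, 30]})

# What an implicit "Summarize By: Sum" does to a Year field: a meaningless total.
implicit_sum = df["year"].sum()

# What the report author almost always wants instead: an explicit measure
# (here, sales summed per year).
by_year = df.groupby("year")["sales"].sum()
print(implicit_sum)
print(by_year)
```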

    Best regards,

    If the answer is the right solution, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".
    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.

  2. Darren Gosbell 2,366 Reputation points

    Can having a cube and a live connection instead of import solve this performance problem?

    No, you can't significantly change the performance just by moving the data model to SSAS. Power BI and SSAS use the same engine, so if you have performance issues in Power BI you will most likely have the same issues in SSAS. 17 million rows is not all that large for a tabular model (I have a demo file with 250 million rows that runs just fine).

    Your issues are most likely related to your model design: having everything in one large table often leads to overly complicated DAX measures. Your best bet would be to use a Kimball-style star schema. It's really hard to help with generic performance issues on a forum, so you may be better off engaging an experienced consultant or looking into some training.
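    A minimal sketch of what a Kimball-style star schema looks like, using pandas with invented table and column names (the real design would depend on your data):

```python
import pandas as pd

# Narrow fact table: surrogate keys plus numeric facts only.
fact_sales = pd.DataFrame({
    "date_key": [20230105, 20230105, 20230210],
    "product_key": [1, 2, 1],
    "amount": [100.0, 50.0, 75.0],
})

# Small dimension table carrying the descriptive attributes.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "product_name": ["Widget", "Gadget"],
    "category": ["Hardware", "Hardware"],
})

# Reports slice the facts by joining to dimensions on the surrogate key,
# which keeps the big table narrow and the measures simple.
report = fact_sales.merge(dim_product, on="product_key")
print(report.groupby("product_name")["amount"].sum())
```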