Hi,
Your model sounds well designed and your measures look fairly simple and clear. The RAM usage is not high because your tabular model is relatively small at 180 MB, which is reasonable. As it stands there is very little obvious performance improvement that can be applied.
What performance hits are you seeing now? Is there a specific problem or goal here?
Performance Issues SSAS tabular Matrix drill downs
When I drill down in a Power BI report, or when I run the extracted DAX query from DAX Studio, I get massive performance hits for certain queries (they take 10-12 seconds). I have tested these queries after a clear-cache query and also monitored the server the tabular model is deployed on, and I see hardly any change to CPU or memory during execution.
The CPU never goes above 19% and the memory usage never goes above 14% when running a big query.
Is there a setting I have missed during setup?
My Models stats:
The model is a 2019 tabular model on a fully patched SQL Server 2019 instance. It is a star schema with no RI violations, only the columns it requires, no bidirectional relationships, and 2 main fact tables. It is deployed on a Standard edition SSAS instance and the tabular model is 180.33 MB.
The SSAS instance has the default configuration, and the machine it is installed on has 1 socket with 16 virtual processors at a maximum speed of 2.10 GHz and 47 GB of memory.
The queries I run are quite large, against a fact table with 10 million rows.
An example of one of the heavier measures in the matrix is:
Measure 1:
Prebilling Rev = [Open Revenue] + [Prod]
Measure 2:
[Prod] = Calculate(Sum('Revenue History'[Revenue]), 'Transaction Types'[Production_Disbursments_Measure_Flag] =1)
Measure 3:
Open Revenue= [Revenue Close] - Sum('Wip History'[WIP])
Measure 4:
Revenue Close = CALCULATE(SUM('Revenue History'[Revenue]), FILTER(ALL('Periods'), Periods[Period_ID] <= MAX(Periods[Period_ID])))
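For completeness, the Revenue Close logic can also be written with a variable so that MAX is only evaluated once. This is just the standard running-total pattern rewritten as a sketch; I have not confirmed whether it actually changes performance on this model:

Revenue Close (var) =
VAR LastPeriod = MAX ( Periods[Period_ID] )  -- last period visible in the current filter context
RETURN
    CALCULATE (
        SUM ( 'Revenue History'[Revenue] ),
        FILTER ( ALL ( 'Periods' ), Periods[Period_ID] <= LastPeriod )  -- running total up to that period
    )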
4 answers
Lukas Yu -MSFT, 2020-08-21T07:52:30.533+00:00
Hi,
Regarding the hardware usage you are concerned about, two things come to mind:
- What is the server configuration of your SSAS tabular server? You can check this by right-clicking the server in SSMS and opening the server properties. Pay special attention to the memory settings. See reference: Server property categories
- There are some limitations in SSAS Standard edition, since you have a relatively big model. You may consider upgrading to Enterprise edition, which allows unlimited RAM usage; for Standard the maximum is 16 GB. See: Scale limit
Or you could try an OLAP cube, which has a bigger RAM/CPU allowance on Standard edition.
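If you want to check how much memory the instance is actually allocating, one thing you could try is a DMV query from an MDX or DAX query window in SSMS connected to the SSAS instance (just a sketch; the output is one row per internal memory object):

-- shows SSAS internal memory objects and their sizes
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE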
Regards,
Lukas
Datacrunk, 2020-08-20T03:57:10.36+00:00
My goal is to get my matrix queries to go from 30 seconds to something a little faster. My question is to figure out whether I have messed up or missed something in configuration that might make my tabular model perform poorly, or whether this is normal tabular behaviour.
I was thinking it might be something wrong with my server or SSAS installation. I guess what you are saying is that it may be a simple limitation of tabular models. If that is the case, it is okay, I will look into OLAP cubes.
I ran my most complicated DAX query (one of the billing reports with a matrix and 12 measures), which takes 30 seconds to query the whole matrix (drilled down to the bottom). In total it takes 30 seconds: SE CPU = 9,689 ms (x1.7), FE = 24,488 ms, SE = 5,595 ms, SE queries = 33. I spent weeks trying to get the measures and tabular model to perform better, but it seems like 5-8 seconds per matrix drill down, or 30 seconds to expand the whole matrix, is pretty normal.
My CPU usage is very, very low, which is strange. I am kind of thinking it is either tabular itself or an incorrect CPU setting somewhere.
Datacrunk, 2020-08-22T23:01:04.67+00:00
Hi @Lukas Yu -MSFT,
Thanks for the reply,
I did invest a good couple of hours investigating, editing and testing configuration changes, but they did not make any large impact on the model's speed, and the configuration is currently back to the default out-of-the-box settings.
My model is currently 90 MB, however the fact table has 14 million rows. Interestingly enough, I removed some columns and aggregated the fact table down from 14 million rows to only 3.5 million to see whether the tabular model would run the same matrix/DAX queries faster. It was only SLIGHTLY faster, and while watching the resource usage of the server during this query, which took 28-odd seconds, it was not touching CPU or memory.
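(In case it is useful to anyone trimming columns the same way: per-column storage can be inspected with the VertiPaq storage DMV from DAX Studio or an MDX query window, as in the sketch below; the table and column IDs in the output depend on your own model.)

-- one row per column segment; larger USED_SIZE values point at the most expensive columns
SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS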
I was thinking about upgrading editions from Standard; however, I currently have 46 GB of RAM in total, with 16 GB of RAM for the Standard instance, and the model is hardly using any memory during any calculations (I hardly see it go up by even 1 percent), so I am hesitant to upgrade the edition given that it is not actually using the RAM or the 16 cores of CPU during the queries at the moment. Weighing up the cost of the edition licence upgrade against the fact that upgrading might not actually make the model run faster is also worrying.
I think I might have to learn MDX and OLAP cubes, which is frustrating, as I thought tabular was meant as a replacement for small models like my 14-million-row fact table model.
Thanks for your advice on this question. It is much appreciated.