How to monitor cube processing using Azure Log Analytics
I want to check whether the full cube refresh is currently processing, and also monitor the record count of a particular table, using Log Analytics. If anyone knows how, please help me.
Azure Monitor
-
bharathn-msft • 5,101 Reputation points • Microsoft Employee
2021-08-19T03:17:08.45+00:00 @Jhansi Nallamothu - Thank you for reaching out with your query. Could you please elaborate on your scenario so that we can help you accordingly?
Are you referring to server metrics or performance counters? Below is some documentation on server metrics and performance counters:
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-monitor#server-metrics
https://learn.microsoft.com/en-us/analysis-services/instances/performance-counters-ssas?view=asallproducts-allversions
Please share additional information so that we can help you accordingly.
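For illustration, here is a minimal sketch of querying those server metrics out of a Log Analytics workspace with the azure-monitor-query Python SDK, assuming diagnostic settings already route the metrics to the workspace. The workspace ID and the metric name in the filter are placeholders; verify the exact metric names against the server-metrics documentation above.

```python
# Minimal sketch: read Azure Analysis Services platform metrics from a
# Log Analytics workspace (assumes diagnostic settings send metrics there).
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

# Platform metrics land in the AzureMetrics table; filter to the Analysis
# Services resource provider. "memory_metric" is only an example name;
# check the metric names in the documentation linked above.
KQL = """
AzureMetrics
| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES"
| where MetricName == "memory_metric"
| summarize avg(Average) by bin(TimeGenerated, 5m)
| order by TimeGenerated desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id=WORKSPACE_ID,
    query=KQL,
    timespan=timedelta(hours=24),  # look back over the last day
)

for table in response.tables:
    for row in table.rows:
        print(row)
```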
-
Jhansi Nallamothu • 1 Reputation point
2021-08-19T13:20:48.347+00:00 @bharathn-msft We are having some memory issues, so we were doing the cube refresh manually; after that we automated the process in ADF. When I ran the manual process, one table received a huge number of records. I want to monitor that table using Log Analytics while the job is executing.
-
HimanshuSinha-msft • 19,476 Reputation points • Microsoft Employee
2021-08-19T22:37:29.777+00:00 Hi @Jhansi Nallamothu ,
I am assuming you are facing memory issues while refreshing the cube and want to monitor the different metrics? As @bharathn-msft pointed out, we do capture some metrics that you can use. Log Analytics will log the metrics data, but not the name of the fact/dimension table being processed. If I were you, I would initiate the refresh at the table level (you can do this using PowerShell or ADF with a Web activity calling the refresh API) and then correlate that with the metrics data.
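To make the table-level refresh concrete, here is a rough sketch that starts a refresh of a single table through the Analysis Services asynchronous refresh REST API and then polls the status URL it returns; the region, server, model, and table names are placeholders, and this is the same call an ADF Web activity would make.

```python
# Rough sketch: table-level refresh via the Analysis Services async refresh
# REST API. Region/server/model/table names below are placeholders.
import time

import requests
from azure.identity import DefaultAzureCredential

REGION = "westus"                    # placeholder: your server's region
SERVER = "<aas-server-name>"         # placeholder
MODEL = "<model-name>"               # placeholder
BASE = f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes"

# Token audience for Azure Analysis Services.
token = DefaultAzureCredential().get_token("https://*.asazure.windows.net/.default").token
headers = {"Authorization": f"Bearer {token}"}

# Refresh a single table instead of the whole model.
body = {
    "Type": "Full",
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 2,
    "Objects": [{"table": "<large-fact-table>"}],  # placeholder table name
}

resp = requests.post(BASE, headers=headers, json=body)
resp.raise_for_status()
status_url = resp.headers["Location"]  # 202 Accepted returns a status URL

# Poll until the refresh finishes.
while True:
    status = requests.get(status_url, headers=headers).json()
    print(time.strftime("%H:%M:%S"), status.get("status"))
    if status.get("status") not in ("notStarted", "inProgress"):
        break
    time.sleep(60)
```

The timestamps printed while polling can then be lined up with the metrics captured in Log Analytics.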
Let me know how it goes .
Thanks
Himanshu -
Jhansi Nallamothu • 1 Reputation point
2021-08-26T08:34:25.67+00:00 @HimanshuSinha-msft I am doing the cube refresh using ADF, but sometimes I get an error like a connection problem. What actions do I need to take to avoid such connection issues?
Operation on target GetAzureRefreshStatus failed: {"code":"InternalError","subCode":0,"message":"An internal error occurred.","timeStamp":"2021-08-26T08:05:17.2362427Z","httpStatusCode":500,"details":[{"code":"RootActivityId","message":"f7dc162b-a07f-44f0-9395-fd83bc70dff0"},{"code":"Param1","message":"A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)"}]}
-
Jhansi Nallamothu • 1 Reputation point
2021-09-02T17:09:24.01+00:00 @HimanshuSinha-msft Could you please provide an update on my question?
-
HimanshuSinha-msft • 19,476 Reputation points • Microsoft Employee
2021-09-02T18:27:44.18+00:00 Hello @Jhansi Nallamothu ,
My sincere apologies for the delay on my side.
You mentioned "sometimes I am getting an error", so it is good to know that it's working most of the time. :) This is what I think:
- Are you pausing the AAS server at times (I know many do, to reduce the cost of AAS)? Since we are getting a status code of 500, the request is reaching the server and the error is being generated on the AAS server itself.
- Another thing you can try is a retry option on the Web activity, but unfortunately the Web activity does not have one built in. You can create your own: use a ForEach loop over a range of 1-5 and add the Web activity inside the loop, use a toggle flag (a pipeline variable) so later iterations are skipped once one attempt succeeds, and add a Wait activity between attempts, set accordingly. The same control flow is sketched below.
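The pattern above is built entirely from pipeline activities; the sketch below only mirrors the same control flow in Python around the GetAzureRefreshStatus call, so the retry and toggle-flag idea is easier to follow. The status URL, headers, and wait time are placeholders.

```python
# Sketch of the retry pattern described above, mirrored in Python.
# In ADF the ForEach cannot break early, so a success-flag variable turns the
# remaining iterations into no-ops; here a plain return does that job.
import time

import requests


def get_refresh_status(status_url, headers, max_attempts=5, wait_seconds=30):
    """Call the refresh-status endpoint, retrying transient failures."""
    last_error = None
    for attempt in range(1, max_attempts + 1):     # ForEach over the range 1-5
        try:
            resp = requests.get(status_url, headers=headers, timeout=60)
            resp.raise_for_status()                # 5xx raises and triggers a retry
            return resp.json()                     # success: stop retrying
        except requests.RequestException as exc:   # transport-level errors, 5xx
            last_error = exc
            if attempt < max_attempts:
                time.sleep(wait_seconds)           # the Wait activity between attempts
    raise RuntimeError(
        f"GetAzureRefreshStatus failed after {max_attempts} attempts"
    ) from last_error
```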
Thanks
Himanshu -
HimanshuSinha-msft • 19,476 Reputation points • Microsoft Employee
2021-09-03T17:39:30.707+00:00 Hello @Jhansi Nallamothu ,
We haven't heard from you since the last response and I was just checking back to see if you have a resolution yet. If you do, please share it with the community, as it can be helpful to others. Otherwise, reply back with more details and we will try to help.
Thanks
Himanshu