Query output limits! (LAWS, PBI)

AshQs 20 Reputation points
Jul 12, 2023, 12:43 PM

Hello community,

 

I am able to successfully connect Power BI to a Log Analytics workspace (LAWS) via an M query (Blank Query). The problem is that the query output from LAWS is limited (based on the number of columns). For a dataset with ~50k rows and 20+ columns, LAWS can only display ~15k rows, and Power BI likewise imports only that amount.

 

1 - Is there a way to remove this restriction and import all the rows directly via the M query?

2 - Is there any workaround? I tried playing with "limit ####" but had no luck.
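
For illustration, the shape of query I have been testing looks roughly like this (the table name is only a placeholder):

    // Raising the limit operator did not change how many rows were returned
    MyLogTable
    | where TimeGenerated > ago(30d)
    | limit 50000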

 

I have been researching and have read that this might be a limitation of LAWS, and that it might require an ADX cluster or another alternative to get all the data in the log tables over to Power BI. Please shed some light and confirm whether this can be accomplished via an M query.

 

Thanks. 

Azure Monitor
An Azure service that is used to collect, analyze, and act on telemetry data from Azure and on-premises environments.

Accepted answer
  1. AnuragSingh-MSFT 21,386 Reputation points
    Jul 18, 2023, 6:25 AM

    AshQs, thank you for the reply. Yes, as mentioned in the service limit, the only way forward is to "reduce results by using query scope, time range, and filters in the query."
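
    As an illustration only, narrowing the scope in KQL could look like the sketch below (table and column names are placeholders, not from your workspace):

        // Filter early on time range and columns so fewer records are returned
        MyLogTable
        | where TimeGenerated between (ago(7d) .. now())      // tighter time range
        | where ResourceGroup == "my-resource-group"          // example filter
        | project TimeGenerated, Computer, OperationName      // only the needed columns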


    Also, note that Power BI is integrated with Log Analytics. It’s easy to create your Power BI dataset directly from logs – please see this article: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-powerbi

    The Power BI integration for Logs works with the Logs API – therefore the limitation is not the UI limit of 30K results but the API limit of 500K results, which should help with your scenario.

    If you need more than 500K results per dataset refresh, there are several options you can use:

    1. Using Power BI dataflows and incremental refresh to break up the query and refresh it incrementally (see the sketch after this list) - https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-powerbi#collect-data-with-power-bi-dataflows
    2. Using Microsoft Power Automate (https://powerautomate.microsoft.com/en-us/) or Azure Logic Apps (https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-overview) to run periodic refreshes and write the data back to Logs
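
    As a rough sketch of option 1, each incremental slice is the same query bounded by a start and end time supplied by the refresh (the table name and the window boundaries below are illustrative only):

        // One refresh slice: only records in this window are pulled,
        // so no single query has to return the whole dataset.
        MyLogTable
        | where TimeGenerated >= datetime(2023-07-01) and TimeGenerated < datetime(2023-07-02)
        | project TimeGenerated, Computer, OperationName, ResultCode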

    The following link could also help; it shows the steps to query LA workspace data as an ADX cluster - https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
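
    For reference, a cross-service query from ADX to a Log Analytics workspace follows the pattern below (subscription, resource group, and workspace names are placeholders):

        // Query a Log Analytics table through the ADX proxy endpoint
        cluster('https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace-name>').database('<workspace-name>').Heartbeat
        | where TimeGenerated > ago(1d)
        | summarize count() by Computer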


    You are right: with the service limits in place, if you want to import all the data into a Power BI dataset using an M query, going through an ADX cluster/Azure Storage would be a good option. For more details, see Log Analytics - Other Export Options. Note that there are 2 steps that will have to be done in this case -

    1. Export the historical data first, in case you need the existing data from the LA workspace.
    2. Set up continuous export, using Diagnostic settings, so that future data also reaches the destination.

    On a side note, I would suggest using the query capabilities of the LA workspace to process, transform, and aggregate/summarize the data in the query so that only the required content is imported into the Power BI dataset. This can reduce the output record count, and no additional infrastructure would need to be set up.
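
    For example, a summarization along these lines (table and column names are assumptions) would bring only aggregated rows into Power BI rather than the raw 50k+ records:

        // Hourly aggregates instead of raw rows - typically far below the limits
        MyLogTable
        | where TimeGenerated > ago(30d)
        | summarize RecordCount = count(), Errors = countif(ResultCode >= 400)
            by bin(TimeGenerated, 1h), Computer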

    I hope this helps.

    2 people found this answer helpful.

1 additional answer

  1. Andrew Blumhardt 9,866 Reputation points Microsoft Employee
    Jul 12, 2023, 2:24 PM

    Maybe I am misunderstanding, but I would expect Power BI to run queries against the logs. The results should be much smaller in row/column count (rather than re-ingesting the data). For example, a performance report would render the binned values rather than pull over the full dataset. I am not clear on a use case where you would need 50K+ rows in Power BI.
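
    For instance, a typical performance query returns binned averages rather than raw samples (a sketch; the exact counters depend on what the workspace collects):

        // Hourly average CPU per computer - a few hundred rows instead of every sample
        Perf
        | where TimeGenerated > ago(7d)
        | where ObjectName == "Processor" and CounterName == "% Processor Time"
        | summarize AvgCpu = avg(CounterValue) by bin(TimeGenerated, 1h), Computer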

    1 person found this answer helpful.
