Weigh the performance impact

Power BI can connect to the web services that Business Central has published. These web services are based on page or query objects. The main differences between a page web service, a query web service, and an API query are:

Page

  • Can use one source table

  • Can contain AL logic and calculated columns

  • Can contain flow fields

  • Cached on the service tier

  • Always executes a SELECT * on the underlying source table, independent of the defined columns in the page object

Query

  • Can join multiple source tables

  • Can calculate aggregations

  • Can't contain any AL logic or calculated columns

  • Can contain flow fields

  • Not cached on the service tier

  • Only SELECTs the columns that are defined as columns in the query object

API Query

  • Can join multiple source tables (faster than API Page)

  • Can calculate aggregations

  • Can't contain any AL logic or calculated columns

  • Can contain flow fields

  • Not cached on the service tier

  • Only SELECTs the columns that are defined as columns in the query object

  • Doesn't need to be published in the Web Services table

As you might have experienced, the performance of query objects is usually better than the performance of page objects, which is why it's recommended to create or use query objects as data sources for Power BI. Even if you require calculated columns, you can define them in Power BI Desktop by creating calculated columns or measures after you import the data. Only use page objects as datasets for Power BI if you require a calculation or execution of business logic in AL code that you can't perform in Power BI Desktop after the data is imported.
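
As an illustration, the following is a minimal sketch of a query object that could serve as a dedicated Power BI data source, joining customers to their ledger entries and summing sales per customer and posting date. The object ID and names are hypothetical, and you still need to publish the query in the Web Services page before Power BI can reach it.

```al
query 50100 "Customer Sales For Power BI"
{
    QueryType = Normal;

    elements
    {
        dataitem(Customer; Customer)
        {
            column(CustomerNo; "No.") { }
            column(CustomerName; Name) { }

            dataitem(CustLedgerEntry; "Cust. Ledger Entry")
            {
                DataItemLink = "Customer No." = Customer."No.";

                column(PostingDate; "Posting Date") { }

                // Summing turns the other columns into an implicit GROUP BY,
                // so only aggregated rows are returned to Power BI.
                column(SalesLCY; "Sales (LCY)")
                {
                    Method = Sum;
                }
            }
        }
    }
}
```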

Efficient extracts to data lakes or data warehouses

When establishing a data lake or a data warehouse, you typically need to do two types of data extraction:

  • A historical load (all data from a given point-in-time)

  • Delta loads (what's changed since the historical load)

The fastest (and least disruptive) way to get a historical load from Business Central online is to get a database export as a BACPAC file (using the Business Central admin center) and restore it in an Azure SQL Database or on a SQL Server. For on-premises installations, you can just take a backup of the tenant database.

The fastest (and least disruptive) way to get delta loads from Business Central online is to set up API queries configured with Read Scale-Out and use the data audit field LastModifiedOn (introduced in version 17.0) in filters.
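
A minimal sketch of such an API query is shown below. The publisher, group, and object names are hypothetical assumptions; DataAccessIntent = ReadOnly routes the request to the read-only replica, and the SystemModifiedAt system field (the data audit field added in version 17.0) is exposed so the consumer can filter on it for delta loads.

```al
query 50101 "Item Ledger Delta API"
{
    QueryType = API;
    APIPublisher = 'contoso';
    APIGroup = 'dataLake';
    APIVersion = 'v1.0';
    EntityName = 'itemLedgerEntryDelta';
    EntitySetName = 'itemLedgerEntryDeltas';

    // Run against the read-only replica so the extract doesn't compete
    // with the read-write business workload.
    DataAccessIntent = ReadOnly;

    elements
    {
        dataitem(ItemLedgerEntry; "Item Ledger Entry")
        {
            column(systemId; SystemId) { }
            column(itemNo; "Item No.") { }
            column(postingDate; "Posting Date") { }
            column(quantity; Quantity) { }

            // Data audit field used to fetch only rows changed since the last run.
            column(lastModifiedOn; SystemModifiedAt) { }
        }
    }
}
```

The consumer can then request only the delta, for example with an OData filter such as `$filter=lastModifiedOn gt 2024-06-01T00:00:00Z`, and store the highest value it has seen for the next run.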

Writing efficient web services

Business Central supports web services to make it easier to integrate with external systems. As a developer, you need to think about the performance of web services for both the Business Central server (the endpoint) and the consumer (the client).

Avoid exposing standard UI pages as web service endpoints. Many parts of a UI page, like FactBoxes, aren't exposed in OData but still use resources to compute.

Things that have historically caused performance problems on pages that are exposed as endpoints are:

  • Heavy logic in OnAfterGetCurrRecord

  • Many SIFT fields

  • FactBoxes
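
A dedicated API page that avoids those constructs might look like the following sketch; the object ID, publisher values, and the small field selection are hypothetical and should be adapted to what the consumer actually needs.

```al
page 50102 "Customer API"
{
    PageType = API;
    APIPublisher = 'contoso';
    APIGroup = 'integration';
    APIVersion = 'v1.0';
    EntityName = 'customer';
    EntitySetName = 'customers';
    SourceTable = Customer;
    DelayedInsert = true;
    ODataKeyFields = SystemId;
    // No FactBoxes, no OnAfterGetCurrRecord logic, and no SIFT-heavy
    // FlowFields on this dedicated endpoint page.

    layout
    {
        area(Content)
        {
            repeater(Group)
            {
                field(id; Rec.SystemId) { }
                field(number; Rec."No.") { }
                field(displayName; Rec.Name) { }
            }
        }
    }
}
```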

Using Read Scale-Out

Business Central supports the Read Scale-Out feature in Azure SQL Database and SQL Server. Read Scale-Out is used to load balance analytical workloads in the database that only read data. Read Scale-Out is built in as part of Business Central online, but it can also be enabled for on-premises versions.

Read Scale-Out applies to queries, reports, and API pages. Instead of sharing the primary database, these objects can be set up to run against a read-only replica. This setup essentially isolates them from the main read-write workload so that they don't affect the performance of business processes.

As a developer, you control Read Scale-Out on reports, API pages, and query objects by using the DataAccessIntent property.

The DataAccessIntent property sets whether to get data for the object from a read-only replica of the database or the primary database.

The property has the following values:

  • ReadOnly - Specifies to get the data from a read-only replica.

  • ReadWrite - Specifies to get the data from the primary database. ReadWrite is the default value.

For reports, API pages, and queries, the Business Central server can use read-only database replicas on Azure SQL Database and SQL Server. If replicas are enabled, use this property to reduce the load on the primary database.

Using ReadOnly might also improve performance when viewing objects. ReadOnly works as a hint for the server to route the connection to a secondary (read-only) replica, if one is available. When a workload is executed against the replica, insert/delete/modify operations aren't possible. If any of these operations are executed against the replica, an exception is thrown at runtime.
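
For example, a report intended purely for analysis could declare the property as in the following sketch; the object ID, name, and Excel layout file are assumptions for the example. Because the report only reads data, routing it to the replica is safe.

```al
report 50103 "Sales Analysis (Read-Only)"
{
    UsageCategory = ReportsAndAnalysis;
    ApplicationArea = All;
    DefaultRenderingLayout = ExcelLayout;

    // Read from the read-only replica when one is available. Any insert,
    // modify, or delete attempted at runtime throws an exception.
    DataAccessIntent = ReadOnly;

    dataset
    {
        dataitem(CustLedgerEntry; "Cust. Ledger Entry")
        {
            column(PostingDate; "Posting Date") { }
            column(SalesLCY; "Sales (LCY)") { }
        }
    }

    rendering
    {
        layout(ExcelLayout)
        {
            Type = Excel;
            LayoutFile = 'SalesAnalysis.xlsx'; // assumed to exist in the project
        }
    }
}
```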

From the client, the property value can be overridden by using the Database Access Intent List page (page 9880).

Performance tips for working with Power BI and Business Central

Keeping the previous information in mind, it's important to keep an eye on performance, especially when creating Power BI reports on top of Business Central. Here are some things to consider.

  • It's better to create a new query or a new page and use it as a dedicated data source for Power BI instead of using existing pages or queries that also serve other purposes. By tailoring your dataset (page/query) for your specific Power BI report, you only need to include the fields you need, nothing more.

  • When you can choose, choose queries over pages as data sources for Power BI. Query objects generate a better, faster SELECT statement on the SQL database, and they can also aggregate data. Create the queries so that they only return the data that's required for the Power BI report, nothing more. For example, if your report requires sales data per day, create a query object that fetches the sales data and aggregates it by day (see the sketch after this list), instead of fetching multiple records per day and letting Power BI aggregate them after import.

  • Implement filters in queries and pages. If there are records in the underlying source tables that aren't required in your report, then you should filter them out as soon as possible.

  • Don't include fields or columns that aren't required in your dataset. You can always add them later, when required, and the web service automatically updates.

  • If you require information from Ledger Entry tables, make sure you aggregate the records. Ledger Entry tables contain many records and usually keep on growing. It's highly unlikely that you need to import and visualize every individual ledger entry in a Power BI report. So create the query with the level of aggregation that matches the requirements for the report.
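
The following is a minimal sketch of such a dedicated, filtered, and aggregated data source. The object ID and names are hypothetical, and the choice of the Value Entry table and the Sale filter is an assumption for the example.

```al
query 50104 "Posted Sales Per Day (PBI)"
{
    QueryType = Normal;

    elements
    {
        dataitem(ValueEntry; "Value Entry")
        {
            // Filter out rows the report doesn't need as early as possible.
            DataItemTableFilter = "Item Ledger Entry Type" = const(Sale);

            // One aggregated row per posting date instead of every ledger entry.
            column(PostingDate; "Posting Date") { }
            column(SalesAmountActual; "Sales Amount (Actual)")
            {
                Method = Sum;
            }
        }
    }
}
```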

As a recommendation, I would suggest using query objects instead of page objects, because query objects generally generate faster SQL statements, can join multiple tables, and can aggregate data.

Page objects allow you to calculate fields using the AL programming language, while query objects can't do that. However, Power BI is also capable of creating calculated columns and measures using the DAX language. So instead of creating a page object with complex calculations in AL code, it's likely better to create a query object and let Power BI perform the complex calculations after the data is imported.