Tutorial: Create and update FinOps hubs

In this tutorial, you learn how to create a new FinOps hub instance, or update an existing one, in Azure or Microsoft Fabric. The tutorial walks through the deployment options and the decisions you need to make as you set up and configure FinOps hubs. This article helps you:

  • Apply FinOps hubs prerequisites.
  • Create a new or update an existing FinOps hub instance.
  • Ingest and backfill data in FinOps hubs.
  • Connect your hub to Microsoft Fabric.
  • Create reports and dashboards.

Prerequisites

  • Access to an active Azure subscription with permissions to deploy the FinOps hubs template.
  • Access to one or more supported Enterprise Agreement (EA), Microsoft Customer Agreement (MCA), or Microsoft Partner Agreement (MPA) scopes in Cost Management to configure exports:
    • Subscriptions and resource groups: Cost Management Contributor.
    • EA billing scopes: Enterprise Reader, Department Reader, or Account Owner (also known as enrollment account).
    • MCA billing scopes: Contributor on the billing account, billing profile, or invoice section.
    • MPA billing scopes: Contributor on the billing account, billing profile, or customer.
  • Optional: Access to Power BI or a Microsoft Fabric workspace with Contributor or Member permissions to create resources and publish reports.
  • Optional: PowerShell 7 or Azure Cloud Shell with the FinOps toolkit PowerShell module installed and imported.

More permissions are covered as part of the tutorial.
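If you plan to use the optional PowerShell path, the FinOps toolkit module can be installed from the PowerShell Gallery before you begin. A minimal sketch, assuming the module is published under the name FinOpsToolkit:

```powershell
# Install and import the FinOps toolkit module
# (assumes the PowerShell Gallery name is FinOpsToolkit)
Install-Module -Name FinOpsToolkit -Scope CurrentUser
Import-Module -Name FinOpsToolkit

# Sign in to Azure before running any toolkit commands
Connect-AzAccount
```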


Enable required resource providers

FinOps hubs use Cost Management to export data and Event Grid to know when data is added to your storage account. Before deploying the template, you need to register the Microsoft.CostManagementExports and Microsoft.EventGrid resource providers.

  1. From the Azure portal, open the list of subscriptions.
  2. Select the subscription to use for your FinOps hub deployment.
  3. In the left menu, select Settings > Resource providers.
  4. In the list of resource providers, find the row for Microsoft.EventGrid.
  5. If the Status column shows Not Registered, select the context menu (...) to the right of the provider name and then select Register.
  6. Repeat steps 4-5 for Microsoft.CostManagementExports.
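If you prefer scripting, the same registration can be done with the Az PowerShell module. A sketch using the standard Register-AzResourceProvider cmdlet; the subscription ID is a placeholder:

```powershell
# Select the subscription to use for the FinOps hub deployment
Set-AzContext -Subscription '<subscription-id>'  # replace with your subscription ID

# Register both resource providers required by FinOps hubs
'Microsoft.EventGrid', 'Microsoft.CostManagementExports' | ForEach-Object {
    Register-AzResourceProvider -ProviderNamespace $_
}

# Registration is asynchronous; check the status until both show Registered
Get-AzResourceProvider -ProviderNamespace Microsoft.EventGrid |
    Select-Object -First 1 ProviderNamespace, RegistrationState
```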

Plan your network architecture

Do you prefer public or private network routing?

  • Public routing is most common, easiest to use, and makes resources reachable from the open internet.
  • Private routing is most secure, comes with added cost, and makes resources only reachable from peered networks.

Public routing doesn't require configuration. If you opt for private routing, work with your network admin to configure peering and routing so the FinOps hubs isolated network is reachable from your network. Before you decide, learn more about the extra configuration steps required in Configure private networking.


Optional: Set up Microsoft Fabric

Many organizations adopt Microsoft Fabric as a unified data platform to streamline data analytics, storage, and processing. FinOps hubs can use Microsoft Fabric Real-Time Intelligence (RTI) as either a primary or secondary data store. This section only applies when configuring Microsoft Fabric as a primary data store instead of Azure Data Explorer.

Configuring Microsoft Fabric is a manual process and requires explicit steps before and after template deployment. This section covers the initial setup requirements.

  1. Create a workspace and eventhouse:
    1. From Microsoft Fabric, open the desired workspace or create a new workspace. Learn more.
    2. From your Fabric workspace, select the + New item command at the top of the page.
    3. Select Store data > Eventhouse.
    4. Specify a name (for example, FinOpsHub) and select Create.
  2. Create and configure the Ingestion database:
    1. Select Eventhouse > + Database at the top of the page, set the name to Ingestion, and select Create.
    2. Select the Ingestion_queryset in the left menu.
    3. Delete all text in the file.
    4. Download and open the finops-hub-fabric-setup-Ingestion.kql file in a text editor.
    5. Copy the entire text from this file into the Fabric queryset editor.
    6. Press Ctrl+H to open the find-and-replace dialog. Set the find text to $$rawRetentionInDays$$ and the replace text to 0 (or the desired number of days to keep data in _raw tables), then press Ctrl+Alt+Enter to replace all instances.
    7. Press Ctrl+Home to move the cursor to the beginning of the file, then press Shift+Enter or select the Run command at the top of the page.
    8. Wait for the script to complete and then review the Result column to confirm all commands completed successfully.
      • If you see an error for a line that has $$rawRetentionInDays$$, repeat steps 2.6 and 2.7.
      • If you experience a different error, create an issue in GitHub.
  3. Repeat step 2 for the Hub database using the finops-hub-fabric-setup-Hub.kql script file.
  4. In the left pane, select System overview, then select the Copy URI link for the Query URI property in the details pane on the right.
    • Make note of the query URI. You'll use it in the next step.

Deploy the FinOps hub template

The core engine for FinOps hubs is deployed via an Azure Resource Manager deployment template, available in Bicep. The template includes a storage account, Azure Data Factory, Azure Data Explorer, and other supporting resources. To learn more about the template and least-privileged access requirements, refer to the FinOps hub template details.

  1. Open the desired template in the Azure portal.
  2. Select the desired subscription and resource group.
  3. Select the Azure region where you want to deploy resources.
    • If connecting to Microsoft Fabric, select the same region as your Fabric capacity. You can find the region in your workspace settings > License info > License capacity.
  4. Specify a hub name used for core resources and reporting purposes.
    • All resources have a common cm-resource-parent tag to group them together under the hub in Cost Management.
  5. Specify a unique Azure Data Explorer cluster name or the Microsoft Fabric eventhouse Query URI.
    • This name is used to query data and connect to reports, dashboards, and other tools.
    • If deploying to Microsoft Fabric, use your Fabric eventhouse query URI and leave the Data Explorer cluster name empty.
    • Data Explorer and Fabric are optional, but recommended if monitoring more than $100,000 in total spend.
    • Warning: Power BI may experience timeouts and data refresh issues if relying on storage for more than $1 million in spend. If you experience issues, redeploy with Data Explorer or Microsoft Fabric.
  6. Select the Next button at the bottom of the form.
  7. If desired, you can change the storage redundancy or Data Explorer SKU.
    • We don't recommend changing either setting for your initial deployment.
    • If using Data Explorer, the storage account is a temporary data store and shouldn't need geo-redundancy.
    • Most deployments don't require a larger Data Explorer SKU. We recommend starting with the dev/test cluster and monitoring performance before scaling up or out.
    • For details about scaling Data Explorer, see Select a SKU for your cluster.
  8. Select the Next button at the bottom of the form.
  9. Set the desired data retention periods.
    • Raw data retention refers to data added to Data Explorer, but not normalized into the final tables. Use 0 unless you need to troubleshoot ingestion issues. This number indicates retention in days.
    • Normalized data retention refers to the time frame, in months, that data is available in the final tables. 0 keeps only the current month, 1 keeps last month and the current month, and so on.
  10. Select the Next button at the bottom of the form.
  11. Indicate if you need infrastructure encryption.
    • Not recommended unless you have specific policies requiring infrastructure encryption.
  12. Indicate whether you want public or private network routing. Learn more.
  13. If you selected private, specify the desired private network address prefix.
  14. Select the Next button at the bottom of the form.
  15. If desired, specify more tags to add to resources.
  16. Select the Next button at the bottom of the form.
  17. Review the configuration summary and select the Create button at the bottom of the form.
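The portal steps above can also be scripted with the FinOps toolkit PowerShell module. A minimal sketch; the Deploy-FinOpsHub parameter names shown are assumptions based on the module's documented pattern, so confirm them with Get-Help Deploy-FinOpsHub before running:

```powershell
# Deploy a FinOps hub instance (parameter names are assumptions; verify with:
# Get-Help Deploy-FinOpsHub -Detailed)
# -Name and -Location mirror the hub name and region chosen in the portal form
Deploy-FinOpsHub `
    -Name 'finops-hub' `
    -ResourceGroupName 'finops-hub-rg' `
    -Location 'westus'
```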

Optional: Configure Fabric access

If you set up Microsoft Fabric as a primary data store, configure access for Data Factory and the Fabric eventhouse.

  1. Get the Data Factory identity:
    1. From the Azure portal, open the FinOps hub resource group.
    2. In the list of resources, select the Data Factory instance.
    3. In the menu on the left, select Settings > Managed identities and copy the Object (principal) ID.
  2. Give Data Factory access to the Hub and Ingestion databases:
    1. From Microsoft Fabric, open the desired workspace and select the target eventhouse.

    2. Select the Ingestion database in the left pane.

    3. Select Ingestion_queryset in the left pane.

    4. Run the following commands separately, replacing <adf-identity-id> with the Data Factory managed identity object ID from step 1:

      .add database Ingestion admins ('aadapp=<adf-identity-id>')
      
      .add database Hub admins ('aadapp=<adf-identity-id>')
      

Configure scopes to monitor

FinOps hubs can monitor any cost and usage dataset that aligns to the FinOps Open Cost and Usage Specification (FOCUS).

You can ingest data from Microsoft Cost Management by creating exports manually or granting access to FinOps hubs to create and manage exports for you. The following steps must be repeated for each scope you need to monitor. We recommend using EA billing accounts and MCA billing profiles for the best coverage and broadest available datasets. To learn more about the difference between manual and managed exports, see Configure scopes.

  1. From the Azure portal, open Cost Management.
  2. Select the desired scope from the scope picker towards the top of the page.
  3. In the menu on the left, select Reporting + analytics > Exports.
  4. Select the Create command.
  5. Select the All costs (FOCUS) + prices template.
  6. Specify a prefix (for example, finops-hub) and select Next at the bottom.
  7. Select the subscription and storage account created by the FinOps hub deployment.
  8. Set the container to msexports.
  9. Set the directory to a unique string that identifies the scope (for example, billingAccounts/###).
  10. Select the Parquet format and Snappy compression for the best performance.
    • Any combination of CSV or Parquet, compressed or uncompressed, is supported, but Snappy-compressed Parquet is recommended.
  11. Select Next at the bottom.
  12. Review and correct settings as needed and then select Create at the bottom.
  13. Repeat steps 4-12 for any more datasets.
    • Reservation recommendations are required for the Rate optimization report's Reservation recommendations page to load.
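If you manage many scopes, exports can also be created with the FinOps toolkit PowerShell module instead of repeating the portal steps. A sketch; the New-FinOpsCostExport parameter names and placeholder values are assumptions to verify against the module help:

```powershell
# Create a FOCUS cost export for a billing scope (parameter names are assumptions;
# verify with: Get-Help New-FinOpsCostExport -Detailed)
New-FinOpsCostExport `
    -Name 'finops-hub-focus' `
    -Scope '/providers/Microsoft.Billing/billingAccounts/<billing-account-id>' `
    -StorageAccountId '/subscriptions/<sub-id>/resourceGroups/finops-hub-rg/providers/Microsoft.Storage/storageAccounts/<hub-storage>' `
    -StorageContainer 'msexports' `
    -Execute
```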

Managed exports

Managed exports allow FinOps hubs to set up and maintain Cost Management exports for you. To enable managed exports, you must grant Azure Data Factory access to read data across each scope you want to monitor. For detailed instructions, see Configure managed exports.

Ingest from other data sources

To ingest data from other data providers that support FOCUS, such as Amazon Web Services (AWS), Google Cloud Platform (GCP), Oracle Cloud Infrastructure (OCI), and Tencent:

  1. Configure a FOCUS dataset from your provider.
  2. Create a workflow to copy data into the ingestion container in the FinOps hub storage account.
    • Files are separated by UTC calendar month, should be less than 2 GB each, and saved in Parquet format. Snappy compression is optional.
    • Files should be placed in the following folder path: Costs/yyyy/mm/{scope}.
      • yyyy represents the four-digit year of the dataset.
      • mm represents the two-digit month of the dataset.
      • {scope} represents a logical, consistent identifier for the dataset. This value can be any valid path using one or more nested folders.
    • If the provider generates nonoverlapping deltas in each dataset, add an extra folder for the day and/or hour (dd or dd/hh) between the month and scope folders.
      • The goal is for overlapping datasets to consistently land in the same folder path so they're overwritten each time. Nonoverlapping datasets should be pushed to a new folder path.
  3. Create an empty manifest.json file in the same folder.
    • Data Explorer ingestion is triggered when manifest.json files are added or updated.
  4. If there are any columns not covered in the current ingestion process, update the Costs_raw and Costs_final_v1_0 tables, and Costs_transform_v1_0, Costs_v1_0, and Costs functions accordingly.
    • Submit a feature request to add new columns to the default ingestion code to ensure customizations don't block future upgrades.
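As an illustration of the folder convention above, the following sketch uploads a FOCUS file and its manifest with the Az.Storage module. The storage account, file, and scope names are hypothetical:

```powershell
# Hypothetical example: publish a FOCUS dataset for June 2024 from an AWS account
$context = New-AzStorageContext -StorageAccountName '<hub-storage>' -UseConnectedAccount
$folder  = 'Costs/2024/06/aws/account-123'   # Costs/yyyy/mm/{scope}

# Copy the Snappy-compressed Parquet file into the ingestion container
Set-AzStorageBlobContent -Context $context -Container 'ingestion' `
    -File './focus-2024-06.snappy.parquet' -Blob "$folder/focus-2024-06.snappy.parquet"

# Add an empty manifest.json to the same folder to trigger Data Explorer ingestion
New-Item -ItemType File -Path './manifest.json' -Force | Out-Null
Set-AzStorageBlobContent -Context $context -Container 'ingestion' `
    -File './manifest.json' -Blob "$folder/manifest.json"
```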

Optional: Populate historical data

FinOps hubs don't automatically backfill data. To populate historical data, run historical data exports from the original data provider, including any custom data pipelines used to publish data into the ingestion storage container.

For Microsoft Cost Management:

  1. From the Azure portal, open Cost Management.
  2. Select the desired scope from the scope picker towards the top of the page.
  3. In the menu on the left, select Reporting + analytics > Exports.
  4. Select the desired export in the list of exports.
    • Always export prices before costs to ensure they're available to populate missing prices in the cost and usage dataset.
    • If costs are exported first, rerun the ingestion_ExecuteETL pipeline for the month's cost data to populate the missing prices.
  5. Select Export selected dates and specify the desired month. Always export the full month.
  6. Repeat step 5 for all desired months.
    • Cost Management only supports exporting up to the last 12 months from the Azure portal.
    • Consider using PowerShell to export beyond the last 12 months.
  7. Repeat steps 4-6 for each export.
  8. Repeat steps 2-7 for each scope.
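The PowerShell option mentioned in step 6 can drive the backfill end to end, beyond the portal's 12-month limit. A sketch using Start-FinOpsCostExport; the -Backfill parameter (number of months) is an assumption to confirm in the module help:

```powershell
# Re-run an existing export for the last 13 months (parameter names are assumptions;
# verify with: Get-Help Start-FinOpsCostExport -Detailed)
Start-FinOpsCostExport `
    -Name 'finops-hub-focus' `
    -Scope '/providers/Microsoft.Billing/billingAccounts/<billing-account-id>' `
    -Backfill 13
```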

Optional: Connect to Microsoft Fabric as a follower

If you chose to configure FinOps hubs with Data Explorer, but are still interested in making data available in Microsoft Fabric, create a shortcut (follower) database using Fabric eventhouses. Shortcut databases are not necessary if you ingested directly into a Fabric eventhouse.

  1. From your Fabric workspace, select the + New item command at the top of the page.
  2. Select Store data > Eventhouse.
  3. Specify a name and select Create.
  4. Select + Database at the top of the page.
  5. Set the name to Ingestion and type to New shortcut database (follower), then select Next.
  6. Set the cluster URI to the FinOps hub cluster URI and database to Ingestion, then select Create.
  7. Repeat steps 4-6 for the Hub database.

Configure reports and dashboards

FinOps hubs come with a Data Explorer dashboard and Power BI reports that can connect to data in Data Explorer (via KQL) or in Azure Data Lake Storage.

We recommend setting up the Data Explorer dashboard even if you use Power BI: setup is quick and easy, and the dashboard provides useful insight into ingested data.

  1. Download the dashboard template.
  2. Grant any users Viewer (or greater) access to the Hub and Ingestion databases. Learn more.
  3. Go to Azure Data Explorer dashboards.
  4. Import a new dashboard from the file you downloaded in step 1.
  5. Edit the dashboard and change the data source to your FinOps hub cluster.

For more information, see Configure Data Explorer dashboards.


Troubleshooting

If you experience a specific error, check the list of common errors for mitigation steps. If you aren't experiencing a specific error code or run into any other issues, refer to the Troubleshooting guide.

If your issue isn't resolved with the troubleshooting guide, see Get support for FinOps toolkit issues for additional help.


Give feedback

Let us know how we're doing with a quick review. We use these reviews to improve and expand FinOps tools and resources.

If you're looking for something specific, vote for an existing idea or create a new one. Share ideas with others to get more votes. We focus on ideas with the most votes.

