Create Materialized Views via DLT Pipelines

Ashwini Gaikwad 130 Reputation points
2024-11-22T21:22:51.9733333+00:00

I am attempting to create materialized views via DLT pipelines in the "abc" catalog and "gold" schema, based on tables in a different catalog and schema ("fmv") that contains 20 tables.

Workspace binding, NSG connectivity, and access have already been set up between the two catalogs.

However, the DLT pipelines fail during the initialization stage with the following error:

"You attempted to update an empty pipeline. This error usually means no tables were discovered in your specified source code. Please verify your source code contains table definitions."

Is it possible to create materialized views in one catalog and schema based on tables residing under different catalog and schema? If so, what are the steps to achieve this?

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

2 answers

  1. AnnuKumari-MSFT 34,556 Reputation points Microsoft Employee Moderator
    2024-11-23T17:56:58.99+00:00

    Hi Ashwini Gaikwad,

    Thank you for using the Microsoft Q&A platform and for posting your query here.

    Based on the provided information, you are trying to create materialized views via DLT pipelines, but the pipelines fail during the initialization stage with the error message: "You attempted to update an empty pipeline. This error usually means no tables were discovered in your specified source code. Please verify your source code contains table definitions."

    It is important to ensure that the necessary permissions and access are granted to the user or service principal creating the materialized views. Specifically, kindly ensure that the identity running the DLT pipeline has:

    • Read access to the source catalog and schema where the base tables reside (USE CATALOG, USE SCHEMA, and SELECT on the tables).
    • Write access to the target catalog and schema where the materialized views will be created (USE CATALOG, USE SCHEMA, and the relevant CREATE privileges); example GRANT statements are sketched after this list.
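
    These privileges can be granted in Unity Catalog with standard GRANT statements. The sketch below is illustrative only and assumes "fmv" is the source catalog; the principal name, source schema, and table names are placeholder assumptions to adjust to the identity the pipeline runs as and to your actual objects:

    ```python
    # Illustrative Unity Catalog grants, run from a notebook via spark.sql.
    # "pipeline_sp@example.com", "source_schema", and "customers" are placeholders.
    principal = "`pipeline_sp@example.com`"

    # Read access on the source side (assuming "fmv" is the source catalog).
    spark.sql(f"GRANT USE CATALOG ON CATALOG fmv TO {principal}")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA fmv.source_schema TO {principal}")
    spark.sql(f"GRANT SELECT ON TABLE fmv.source_schema.customers TO {principal}")

    # Write access on the target side where the materialized views are created.
    spark.sql(f"GRANT USE CATALOG ON CATALOG abc TO {principal}")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA abc.gold TO {principal}")
    spark.sql(f"GRANT CREATE TABLE ON SCHEMA abc.gold TO {principal}")
    ```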

    Additionally, kindly make sure to use fully qualified table names (catalog.schema.table) when referencing the source tables in your pipeline code, as in the sketch below.
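
    For example, a minimal Python sketch of a DLT source notebook, assuming the pipeline's target catalog and schema are configured as "abc" and "gold" and that "fmv" is the source catalog; the schema and table names ("source_schema", "customers") are placeholders:

    ```python
    import dlt

    # In a DLT pipeline, the `spark` session object is provided by the runtime.
    # With Unity Catalog and the pipeline target set to abc.gold, a @dlt.table
    # definition over a batch read is materialized as abc.gold.customers_mv.
    @dlt.table(
        name="customers_mv",
        comment="Materialized view built from a table in the fmv catalog",
    )
    def customers_mv():
        # Fully qualified three-level name: catalog.schema.table
        return spark.read.table("fmv.source_schema.customers")
    ```

    Without at least one such @dlt.table (or @dlt.view) definition in the pipeline's source code, there is nothing for the pipeline to discover, which produces the "empty pipeline" error above.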

    Hope it helps. Kindly let us know how it goes. Thank you.


  2. AnnuKumari-MSFT 34,556 Reputation points Microsoft Employee Moderator
    2024-11-28T04:48:33.9233333+00:00

    Hi Ashwini Gaikwad,

    I got the response from the internal team.

    "Seems like in your code there's no Delta Live Table syntax. DLT pipeline is looking for that syntax to run the pipeline. If your goal is to schedule updating the materialized view, you can take a look here on how to do it: Schedule materialized view refreshes

    Once you schedule it, then you'll be able to see the job/pipeline created in the workflows tab. 

    Note: Currently, all DLT tables must reside in same Catalog.Schema.  However, in a future release, you will be able to put any Table/MV into different catalog/schemas."Hope it helps. Kindly accept the answer by clicking on Accept answer button. Thankyou
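
    For reference, a minimal sketch of the scheduled materialized view approach mentioned above, submitted here with spark.sql (it can equally be run as plain SQL from a SQL warehouse). The view name, source table, columns, and cron expression are illustrative placeholders, and creating materialized views this way assumes a Unity Catalog workspace with serverless pipelines available:

    ```python
    # Sketch only: create a materialized view in abc.gold that refreshes daily at 06:00 UTC.
    # "customers_daily_mv", "fmv.source_schema.customers", and the columns are placeholders.
    spark.sql("""
        CREATE MATERIALIZED VIEW abc.gold.customers_daily_mv
        SCHEDULE CRON '0 0 6 * * ?' AT TIME ZONE 'UTC'
        AS
        SELECT customer_id, customer_name, created_at
        FROM fmv.source_schema.customers
    """)
    ```

    Once created, the backing refresh pipeline appears under the Workflows tab, as noted above.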

