Accessing Fabric Lakehouse via Logic App

JC 10 Reputation points
2024-07-10T02:35:07.8866667+00:00

Has anyone had any luck connecting a logic app to a lakehouse in Microsoft Fabric?

Some additional background:

We have a data pipeline that ultimately generates a summary of the actions taken within the pipeline. This summary is stored as an Excel spreadsheet under Files in our lakehouse. One of the last tasks in our Data Factory pipeline is to email this report. Unfortunately, while Data Factory has an email activity, it does not support attachments. Our workaround was to create a logic app: Data Factory posts a request to it, the logic app parses the JSON, retrieves the file from the Fabric lakehouse, and then emails that file to the appropriate group.
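For context, the handoff between Data Factory and the logic app amounts to posting a small JSON payload to the HTTP trigger and parsing it on the other side. A minimal sketch (the field names here are assumptions for illustration, not a fixed schema):

```python
# Sketch: the JSON payload a Data Factory Web activity might POST to the
# Logic App's HTTP trigger. Field names are hypothetical.
import json

payload = json.dumps({
    "workspace": "MyWorkspace",
    "lakehouse": "MyLakehouse",
    "filePath": "reports/summary.xlsx",
    "recipients": ["team@example.com"],
})

# The Logic App's "parse JSON" step corresponds to:
parsed = json.loads(payload)
print(parsed["filePath"])  # the file to fetch from the lakehouse
```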

This approach unfortunately fails when utilizing Get Blob Content (V2). The error generated is:

Provided storage account endpoint doesn't follow the naming convention. Storage Account name should be between 3 and 24 characters (lowercase letters and numbers). If providing the Storage Account endpoint, make sure to enter the full URI (for example https://accountname.blob.storage.net)

We tried the abfs path Fabric utilizes:

abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse_name>.Lakehouse/Files

And we tried the BLOB URL:

https://onelake.blob.fabric.microsoft.com/<workspace>/<lakehouse_name>.Lakehouse/Files/

And then the DFS URL (because why not):

https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse_name>.Lakehouse/Files

Nothing works... And the odd thing is, Azure Storage Explorer has no problem accessing the underlying ADLS Gen2 storage (the lakehouse). So I'm assuming this is a problem with how this particular action works within Logic Apps. It looks like the action validates the URI, but Fabric uses a slightly different endpoint naming convention (even though it's just ADLS Gen2) and Logic Apps just hasn't caught up?
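For what it's worth, the DFS URL shape tried above matches the ADLS Gen2 path layout OneLake exposes. A minimal sketch of building that URL (the workspace and lakehouse names are hypothetical, and a real request would also need a Microsoft Entra ID bearer token):

```python
# Sketch: building the OneLake DFS URL for a lakehouse file.
# OneLake exposes a single fixed account endpoint, and the workspace takes
# the place of the filesystem (container) in the ADLS Gen2 path layout.
def onelake_read_url(workspace: str, lakehouse: str, file_path: str) -> str:
    return (
        "https://onelake.dfs.fabric.microsoft.com/"
        f"{workspace}/{lakehouse}.Lakehouse/Files/{file_path}"
    )

# Hypothetical names for illustration:
url = onelake_read_url("MyWorkspace", "MyLakehouse", "reports/summary.xlsx")
print(url)
```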

This is our first time utilizing Logic Apps, so this might be a dense question. We did not find any other actions specifically for Fabric, so we are a bit at a loss.

Thanks for any help/guidance you can provide!


1 answer

Sort by: Most helpful
  1. Sumarigo-MSFT 47,106 Reputation points Microsoft Employee
    2024-07-10T06:54:33.51+00:00

    @JC Welcome to the Microsoft Q&A forum, and thank you for posting your query here! Fabric-related queries/issues are currently not supported on this platform, so please post them in the dedicated forum:

    https://community.fabric.microsoft.com/

    Additional information: that said, let me share some insights that may help.

    It sounds like you are trying to use a Logic App to retrieve a file from a lakehouse in Microsoft Fabric and email it as an attachment. However, you are encountering an error when using the Get Blob Content (V2) action in the Logic App.

    Based on the error message you provided, the Logic App is having trouble validating the storage account endpoint. This may be because Microsoft Fabric uses a slightly different endpoint naming convention than other Azure storage services.

    To resolve this issue, you may want to try using the Azure Data Lake Storage Gen2 connector in the Logic App instead of the Get Blob Content (V2) action. The Azure Data Lake Storage Gen2 connector is specifically designed to work with Azure Data Lake Storage Gen2, which is the underlying storage technology used by Microsoft Fabric.

    Here are the high-level steps to implement this solution:

    1. Create a new Logic App in the Azure portal.
    2. Add a trigger to the Logic App that is appropriate for your scenario (e.g., HTTP request, schedule, etc.).
    3. Add an action to the Logic App to retrieve the file from the lakehouse using the Azure Data Lake Storage Gen2 connector.
    4. Add an action to the Logic App to email the file as an attachment.
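    Step 4 above, expressed outside Logic Apps, amounts to attaching the fetched bytes to a mail message. A standard-library Python sketch (the addresses and subject are placeholders, the SMTP send itself is omitted, and the file bytes are simulated here rather than fetched from OneLake):

```python
# Sketch: attaching fetched file bytes to an email message.
# In the Logic App, the bytes come from the storage action instead.
from email.message import EmailMessage

def build_report_mail(file_bytes: bytes, file_name: str) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = "Pipeline summary report"  # hypothetical subject
    msg["From"] = "pipeline@example.com"        # hypothetical addresses
    msg["To"] = "team@example.com"
    msg.set_content("The summary report is attached.")
    # Excel files use the Open XML spreadsheet MIME type.
    msg.add_attachment(
        file_bytes,
        maintype="application",
        subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        filename=file_name,
    )
    return msg

mail = build_report_mail(b"fake xlsx bytes", "summary.xlsx")
```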

    Here is some sample workflow-definition JSON that illustrates retrieving a file from a lakehouse and emailing it as an attachment (placeholders are in angle brackets; the exact connection name and connector paths may differ in your environment):

    {
        "actions": {
            "Get_file_content": {
                "inputs": {
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['azuredataLakeStorageGen2']['connectionId']"
                        }
                    },
                    "method": "get",
                    "path": "/v2/<lakehouse_name>.Lakehouse/Files/<file_path>",
                    "queries": {
                        "upn": "<user_principal_name>"
                    }
                },
                "runAfter": {},
                "type": "ApiConnection"
            },
            "Send_email": {
                "inputs": {
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['office365']['connectionId']"
                        }
                    },
                    "method": "post",
                    "path": "/v2/Mail",
                    "body": {
                        "To": "<recipient_address>",
                        "Subject": "<email_subject>",
                        "Body": "<email_body>",
                        "Attachments": [
                            {
                                "Name": "<file_name>",
                                "ContentBytes": "@{base64(body('Get_file_content'))}"
                            }
                        ]
                    }
                },
                "runAfter": {
                    "Get_file_content": [
                        "Succeeded"
                    ]
                },
                "type": "ApiConnection"
            }
        }
    }

    Please let us know if you have any further queries; I'm happy to assist you further.


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you, as this can benefit other community members.

