How to create Synapse Serverless SQL Pool External Table using Databricks Notebook?

Azure Enthusiast 0 Reputation points
2023-07-24T20:04:56.9766667+00:00

Hello,

Can we create Synapse Serverless SQL Pool External Table using a Databricks Notebook?

E.g., the script to create a Synapse Serverless SQL Pool External Table from within Synapse is as follows:

CREATE EXTERNAL TABLE [SchemaName].[TableName] (
    -- table specification: column names, data types, etc.
)
WITH (
    LOCATION = '<Delta Lake table path in ADLS>',
    DATA_SOURCE = [External Data Source Name],
    FILE_FORMAT = [External File Format]
)

How do we run the above script from Databricks Notebook to create the Serverless SQL Pool External Table?

Many Thanks in advance!

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.
Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

2 answers

Sort by: Most helpful
  1. Vinodh247 9,071 Reputation points
    2023-07-25T05:02:07.4+00:00

    Hi,

    Thanks for reaching out to Microsoft Q&A.

    Please see the answer from Pradeep below; he has successfully connected and created an external table this way. Can you confirm whether you have already tried the approach described here?

    https://stackoverflow.com/questions/70161819/is-it-possible-to-connect-to-serverless-sql-pool-via-azure-databricks

    Please Upvote and Accept as answer if the reply was helpful; this will benefit other community members who run into the same issue.


  2. PRADEEPCHEEKATLA-MSFT 73,651 Reputation points Microsoft Employee
    2023-07-25T06:11:38.67+00:00

    @Azure Enthusiast - Thanks for the question and using MS Q&A platform.

    Yes, you can create a Synapse Serverless SQL Pool External Table from a Databricks Notebook. The serverless SQL pool is a T-SQL endpoint, so the approach is to open a plain JDBC (or ODBC) connection from the notebook to the serverless endpoint (`<your-workspace>-ondemand.sql.azuresynapse.net`) and execute the CREATE EXTERNAL TABLE statement directly. Note that Spark's own JDBC data source can only read and write tables and cannot run DDL, so the connection is opened through the JVM instead.

    Here is an example code snippet to create an external table in Synapse Serverless SQL Pool from a Databricks Notebook. The endpoint, database, credentials, and object names are placeholders, and the external data source and file format must already exist in the serverless database:

    # Serverless SQL pool endpoint of the Synapse workspace (placeholder values)
    jdbc_url = (
        "jdbc:sqlserver://<your-workspace>-ondemand.sql.azuresynapse.net:1433;"
        "database=<your-database>;encrypt=true;loginTimeout=30;"
    )
    user = "<sql-login>"
    password = "<password>"
    
    # T-SQL DDL for the external table; LOCATION is a path relative to the data source
    ddl = """
    CREATE EXTERNAL TABLE [SchemaName].[TableName] (
        [col1] INT,
        [col2] VARCHAR(100),
        [col3] FLOAT
    )
    WITH (
        LOCATION = '<folder-or-file-path>',
        DATA_SOURCE = [<external-data-source-name>],
        FILE_FORMAT = [<external-file-format-name>]
    );
    """
    
    # Open a JDBC connection via the JVM (the SQL Server JDBC driver ships with Databricks)
    driver_manager = spark.sparkContext._gateway.jvm.java.sql.DriverManager
    conn = driver_manager.getConnection(jdbc_url, user, password)
    try:
        stmt = conn.createStatement()
        stmt.execute(ddl)
        stmt.close()
    finally:
        conn.close()
    

    Replace the placeholders with your own values and run the code in a Databricks Notebook to create the external table in Synapse Serverless SQL Pool. SQL authentication is shown for brevity; Azure AD authentication also works with the appropriate JDBC connection properties.
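    If you prefer to keep the column list and table options as Python values and generate the T-SQL from them, a small helper can render the statement before you send it over the connection. This is a minimal sketch: `build_external_table_ddl` and all the sample names below are illustrative, not part of any Synapse or Databricks API.

    ```python
    def build_external_table_ddl(table_name, schema, location, data_source, file_format):
        """Render a Synapse serverless CREATE EXTERNAL TABLE statement.

        `schema` is a list of (column_name, sql_type) pairs; `data_source` and
        `file_format` are the names of an external data source and file format
        that already exist in the serverless database. Illustrative helper only.
        """
        columns = ",\n    ".join(f"[{name}] {sql_type}" for name, sql_type in schema)
        return (
            f"CREATE EXTERNAL TABLE {table_name} (\n"
            f"    {columns}\n"
            f")\n"
            f"WITH (\n"
            f"    LOCATION = '{location}',\n"
            f"    DATA_SOURCE = [{data_source}],\n"
            f"    FILE_FORMAT = [{file_format}]\n"
            f");"
        )

    # Example with made-up names; pass the resulting string to stmt.execute(...)
    ddl = build_external_table_ddl(
        "[dbo].[Sales]",
        [("col1", "INT"), ("col2", "VARCHAR(100)"), ("col3", "FLOAT")],
        "sales/",
        "MyDataSource",
        "ParquetFormat",
    )
    print(ddl)
    ```

    Keeping the DDL generation separate from the connection code also makes it easy to log or review the exact statement before it runs against the pool.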

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.