@Azure Enthusiast - Thanks for the question and using MS Q&A platform.
Yes, you can create a Synapse Serverless SQL Pool external table from a Databricks Notebook. One approach is to use the Synapse Spark connector to connect to your Synapse workspace and execute a CREATE EXTERNAL TABLE statement.
Here is an example code snippet to create an external table in Synapse Serverless SQL Pool from a Databricks Notebook:
# Set up the Synapse Spark connector configuration
# (adjust these configuration keys to match your connector and linked-service setup)
spark.conf.set("spark.sql.synapse.workspace.name", "<your-synapse-workspace-name>")
spark.conf.set("spark.sql.synapse.linkedService", "<your-synapse-linked-service-name>")
spark.conf.set("spark.sql.synapse.synapseSqlPool", "<your-synapse-sql-pool-name>")

# Define the external table schema
externalTableSchema = "col1 INT, col2 STRING, col3 DOUBLE"

# Define the external table options
externalTableOptions = {
    "location": "<your-external-table-location>",
    "data_source": "<your-external-data-source-name>",
    "file_format": "<your-external-file-format-name>"
}

# Build an OPTIONS clause of key 'value' pairs; embedding the dict as raw
# JSON would not be valid SQL syntax
optionsClause = ", ".join(f"{k} '{v}'" for k, v in externalTableOptions.items())

# Create the external table in Synapse Serverless SQL Pool
spark.sql(
    f"CREATE EXTERNAL TABLE <your-external-table-name> "
    f"({externalTableSchema}) USING SYNAPSE OPTIONS ({optionsClause})"
)
Replace the placeholders with your own values and run the code in a Databricks Notebook to create the external table. Note that the external data source and external file format objects referenced in the options must already exist in the SQL pool.
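For reference, a Synapse serverless SQL pool itself accepts T-SQL of the form CREATE EXTERNAL TABLE ... WITH (LOCATION = ..., DATA_SOURCE = ..., FILE_FORMAT = ...), where the column types must be T-SQL types (e.g. VARCHAR rather than STRING, FLOAT rather than DOUBLE). The snippet below is a minimal sketch of building that statement from the same kind of schema and options dict used above; `build_create_external_table` and all the names in it are illustrative, not part of any Synapse or Databricks library.

```python
# Minimal sketch: render the T-SQL CREATE EXTERNAL TABLE statement that a
# Synapse serverless SQL pool expects. build_create_external_table is a
# hypothetical helper, not a library function.

def build_create_external_table(table_name: str, schema: str, options: dict) -> str:
    """Build a serverless-SQL-pool CREATE EXTERNAL TABLE statement.

    `options` carries 'location', 'data_source', and 'file_format', matching
    the externalTableOptions dict used in the connector snippet above.
    The data source and file format are referenced by object name, so they
    must already have been created in the SQL pool.
    """
    return (
        f"CREATE EXTERNAL TABLE {table_name} ({schema}) "
        f"WITH ("
        f"LOCATION = '{options['location']}', "
        f"DATA_SOURCE = {options['data_source']}, "
        f"FILE_FORMAT = {options['file_format']}"
        f")"
    )

# Example with hypothetical placeholder values and T-SQL column types
ddl = build_create_external_table(
    "dbo.MyExternalTable",
    "col1 INT, col2 VARCHAR(100), col3 FLOAT",
    {
        "location": "/data/*.parquet",
        "data_source": "MyDataSource",
        "file_format": "MyParquetFormat",
    },
)
print(ddl)
```

The rendered statement can then be submitted to the serverless endpoint (typically `<workspace>-ondemand.sql.azuresynapse.net`) with any SQL Server-compatible client, such as pyodbc or a JDBC connection, as an alternative to the connector approach above.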
Hope this helps. Do let us know if you have any further queries.
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful".