Hello @ICHAN
Thanks for the question and for using the Microsoft Q&A platform.
You can create an external partitioned table using the Parquet format from blob storage in Azure Databricks with the following steps:
- Mount the blob storage container to the Databricks File System (DBFS); see the sketch after this list.
- Create a table in Databricks using the Parquet format, pointing it at the mounted blob storage location.
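For the mount step, a minimal sketch using `dbutils.fs.mount` looks like the following; the container, storage account, mount point, and secret scope/key names are all placeholders you would replace with your own values:

```python
# Sketch: mount an Azure Blob storage container to DBFS.
# <container>, <storage-account>, <mount-name>, <scope>, and <key>
# are placeholders, not real names.
dbutils.fs.mount(
    source="wasbs://<container>@<storage-account>.blob.core.windows.net",
    mount_point="/mnt/<mount-name>",
    extra_configs={
        "fs.azure.account.key.<storage-account>.blob.core.windows.net":
            dbutils.secrets.get(scope="<scope>", key="<key>")
    }
)
```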
As per the repro, I used the simpler method, i.e. accessed Azure Blob storage directly using the DataFrame API and then created a table using the Parquet format, as shown below.
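A minimal sketch of that access step, assuming you authenticate with the storage account key (the account, container, and key values are placeholders):

```python
# Sketch: access Azure Blob storage directly with the DataFrame API.
# <storage-account>, <container>, and <access-key> are placeholders.
spark.conf.set(
    "fs.azure.account.key.<storage-account>.blob.core.windows.net",
    "<access-key>"
)

# Read the Parquet files that the external table will point at.
df = spark.read.parquet(
    "wasbs://<container>@<storage-account>.blob.core.windows.net/somelocation/TABLE_1/"
)
df.show()
```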
Here is an example of how to create an external partitioned table using the Parquet format over blob storage in Databricks:
```sql
CREATE TABLE IF NOT EXISTS table_1 (
  <column_1_name> <column_1_data_type>,
  <column_2_name> <column_2_data_type>,
  ...
)
USING PARQUET
PARTITIONED BY (part STRING)
OPTIONS (path '/somelocation/TABLE_1/')
```
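If the path already contains partition directories (for example, `part=.../`), the metastore does not pick them up automatically; a hedged follow-up, assuming the table above, is to register them with `MSCK REPAIR TABLE` and then query a partition:

```python
# Register existing partition directories with the metastore, then
# query a single partition. 'part' is the partition column from the
# CREATE TABLE statement above; the partition value is a placeholder.
spark.sql("MSCK REPAIR TABLE table_1")
spark.sql("SELECT * FROM table_1 WHERE part = '<partition-value>'").show()
```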
Hope this helps. Do let us know if you have any further queries.
Please don’t forget to Accept Answer wherever the information provided helps you, as this can be beneficial to other community members.