write dataframe to delta parquet

arkiboys 9,366 Reputation points
2022-07-02T14:38:03.827+00:00

Hello,
I know how to read Delta parquet files into a dataframe,
example:
df_delta = spark.read.format("delta").load("...folder_path...")

Question:
How can I write the result of a dataframe out to Delta parquet files?
example?
df.write.format("delta").mode("overwrite").save(f"abfss://curated@{storage_account_name}.dfs.core.windows.net/folderpath")

The above line gives error as:

AnalysisException                         Traceback (most recent call last)
<command-3137735205505607> in <module>
      7
      8 df = spark.sql(strSQL)
----> 9 df.write.format("delta").mode("overwrite").save(f"abfss://curated@{storage_account_name}.dfs.core.windows.net/folderpath")

/databricks/spark/python/pyspark/sql/readwriter.py in save(self, path, format, mode, partitionBy, **options)
    738             self._jwrite.save()
    739         else:
--> 740             self._jwrite.save(path)
    741
    742     @since(1.4)

/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1302
   1303         answer = self.gateway_client.send_command(command)
-> 1304         return_value = get_return_value(
   1305             answer, self.gateway_client, self.target_id, self.name)
   1306

Thank you

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. arkiboys 9,366 Reputation points
    2022-07-03T09:32:50.69+00:00

    Solved: the cause was brackets in the column names, which are not allowed in Parquet.
    Thanks
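
    For reference, a minimal sketch of one way to strip the disallowed characters from column names before writing. The helper `sanitize_column_name` is hypothetical (not from this thread); it replaces the characters Parquet rejects in column names (space, `,;{}()\n\t=`) with underscores, and the commented Spark lines show where it would be applied.

    ```python
    import re

    # Hypothetical helper: replace characters Parquet does not allow
    # in column names (" ,;{}()\n\t=") with underscores.
    def sanitize_column_name(name: str) -> str:
        return re.sub(r"[ ,;{}()\n\t=]", "_", name)

    # Applied to a Spark DataFrame before writing (sketch):
    # df = df.toDF(*[sanitize_column_name(c) for c in df.columns])
    # df.write.format("delta").mode("overwrite").save(path)
    ```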

    1 person found this answer helpful.