Hi all, I am trying to read a telemetry log that is currently stored in Blob Storage without the hierarchical namespace enabled. When I try to read it, I get the following error.
All the files in my storage account are saved as append blobs. When I try to read an append blob in Data Lake Gen2 with the hierarchical namespace enabled, I also get a similar error.
Please help; thanks in advance.
Exception: Incorrect Blob type, please use the correct Blob type to access a blob on the server. Expected BLOCK_BLOB, actual APPEND_BLOB.
It looks like reading append blobs is not supported in Azure Databricks or Synapse; please see the link below, where this has been answered and accepted earlier.
@Bloody Data Welcome to the Microsoft Q&A forum, and thank you for posting your query here!
An append blob is composed of blocks and is optimized for append operations. When you modify an append blob, blocks are added only to the end of the blob, via the Append Block operation. Updating or deleting existing blocks is not supported. Unlike a block blob, an append blob does not expose its block IDs.
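To illustrate those semantics, here is a minimal, purely local Python model of an append blob (not the Azure SDK — class and method names below are illustrative only): blocks can only be added at the end, reads return all committed blocks in order, and updating an existing block is rejected.

```python
# Simplified, illustrative model of append-blob semantics.
# This is NOT the azure-storage-blob SDK; it only demonstrates why
# append blobs are append-only.
class AppendBlobModel:
    def __init__(self):
        self._blocks = []  # committed blocks, in append order

    def append_block(self, data: bytes) -> None:
        """Mirrors the Append Block operation: new data always goes at the end."""
        self._blocks.append(data)

    def read_all(self) -> bytes:
        """Reading returns every committed block, concatenated in order."""
        return b"".join(self._blocks)

    def update_block(self, index: int, data: bytes) -> None:
        """Append blobs do not support updating or deleting existing blocks."""
        raise NotImplementedError("Append blobs are append-only")


# Example: telemetry lines are only ever appended, never rewritten.
blob = AppendBlobModel()
blob.append_block(b"2024-01-01 INFO service started\n")
blob.append_block(b"2024-01-01 INFO request handled\n")
print(blob.read_all().decode())
```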
There is a similar thread discussion; please refer to the suggestion mentioned there: How to Read Append Blobs as DataFrames in Azure DataBricks
You can also use the Azure Storage Explorer tool: https://azure.microsoft.com/en-us/products/storage/storage-explorer
Please let us know if you have any further queries. I’m happy to assist you further.
Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.
I encountered exactly the same issue using mounts in Databricks.
However, abfss seems to use a different driver: I managed to read append blobs from a Databricks notebook using abfss.
You can create an external table with an abfss location and then load it into any DataFrame.
Simply follow the basic guide from Databricks for accessing Blob storage: https://docs.databricks.com/storage/azure-storage.html#language-Account%C2%A0key
Hope this helps.
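As a hedged sketch of that approach, the snippet below builds the account-key configuration name and the abfss URI that the guide describes; the account name, container, and path are placeholders to substitute with your own values, and the Spark/Databricks calls themselves are shown in comments since they need a running cluster.

```python
# Placeholders -- replace with your own storage account, container, and path.
storage_account = "mystorageaccount"  # hypothetical account name
container = "telemetry"               # hypothetical container name

# Spark configuration key for account-key auth with the ABFS driver:
conf_key = f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

# abfss URI pointing at the log folder inside the container:
abfss_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/logs/"

print(conf_key)
print(abfss_path)

# In a Databricks notebook you would then do something like:
# spark.conf.set(conf_key, dbutils.secrets.get(scope="my-scope", key="account-key"))
# df = spark.read.text(abfss_path)  # load the logs directly into a DataFrame, or:
# spark.sql(f"CREATE TABLE logs USING TEXT LOCATION '{abfss_path}'")  # external table
```

Reading through abfss this way avoids the BLOCK_BLOB/APPEND_BLOB mismatch that the mount-based (wasbs) driver raises.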