Synapse Spark "Request exceeds throttling limits"

Julian Breunung 22 Reputation points
2022-08-19T14:06:21.483+00:00

I'm using a Synapse Spark notebook to load multiple Parquet files sequentially into a Delta table.
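Roughly, the notebook loops over the files and appends each one to the table, along these lines (the paths and table name below are placeholders, not my actual code):

```python
# Illustrative file list; the real paths live in ADLS Gen2.
parquet_paths = [
    "abfss://data@mystorage.dfs.core.windows.net/incoming/part-0001.parquet",
    "abfss://data@mystorage.dfs.core.windows.net/incoming/part-0002.parquet",
]

for path in parquet_paths:
    df = spark.read.parquet(path)
    # One append per file; each write goes through the Synapse metastore.
    df.write.format("delta").mode("append").saveAsTable("staging.my_delta_table")
```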
I'm running into the following issue:

AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Metadata service API com.microsoft.catalog.metastore.sasClient.SasClient$API@3219edf6 failed with status -1 (null) Response Body ({"result":"DependencyError","errorId":"TooManyRequests","errorMessage":"BBCServiceException is [Request exceeds throttling limits: PerClusterNodesReads1Min=15, throttlingKey=1133da19-6380-42e5-82b3-b09287fb1fc0; Retry after 5 seconds.]. TraceId : 015adc1c-c4b6-4d24-b2db-16cedb950c59. Error Component : BBC"}))

Where is the throttling limit defined? Can it be adjusted? Is there any way to mitigate the issue?
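As a stopgap I could wrap each write in a retry with back-off, roughly like this (an untested sketch; `write_with_retry` and its parameters are just illustrative), but I'd prefer to understand where the limit actually comes from:

```python
import time

def write_with_retry(df, table_name, max_attempts=5, base_delay_seconds=5):
    """Append df to a Delta table, backing off when the metastore throttles."""
    for attempt in range(1, max_attempts + 1):
        try:
            df.write.format("delta").mode("append").saveAsTable(table_name)
            return
        except Exception as exc:
            # The throttling error surfaces with "TooManyRequests" in the
            # message and suggests retrying after a few seconds.
            if "TooManyRequests" in str(exc) and attempt < max_attempts:
                time.sleep(base_delay_seconds * attempt)
            else:
                raise
```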
