Synapse Spark "Request exceeds throttling limits"
I'm using a Synapse Spark notebook to sequentially load multiple Parquet files into a Delta table.
I'm running into the following error:
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Metadata service API com.microsoft.catalog.metastore.sasClient.SasClient$API@3219edf6 failed with status -1 (null) Response Body ({"result":"DependencyError","errorId":"TooManyRequests","errorMessage":"BBCServiceException is [Request exceeds throttling limits: PerClusterNodesReads1Min=15, throttlingKey=1133da19-6380-42e5-82b3-b09287fb1fc0; Retry after 5 seconds.]. TraceId : 015adc1c-c4b6-4d24-b2db-16cedb950c59. Error Component : BBC"}))
Where is this throttling limit defined? Can it be adjusted? Is there any way to mitigate the issue?
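For reference, the kind of mitigation I had in mind is wrapping each write in a retry with exponential backoff, since the error message says "Retry after 5 seconds". A minimal generic sketch (in the notebook the actual call would be the Delta append, and the exception surfaces as an `AnalysisException`; `retry_on_throttle` is just an illustrative helper name):

```python
import time

def retry_on_throttle(fn, max_retries=5, base_delay=1.0):
    """Call fn(); on a metastore throttling error ('TooManyRequests'
    in the message), wait with exponential backoff and retry."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as e:
            # Re-raise anything that isn't the throttling error,
            # or if we've exhausted our retry budget.
            if "TooManyRequests" not in str(e) or attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage in the loop would look roughly like:
# for path in parquet_paths:
#     retry_on_throttle(lambda: (
#         spark.read.parquet(path)
#              .write.format("delta").mode("append")
#              .saveAsTable("my_table")))
```

But this only papers over the symptom, so I'd still like to know where the per-cluster limit itself comes from.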