How to fix request throttling (quota exceeded) in an Azure Data Factory pipeline built with data flows

Rohan Dussa

I have created a data flow with a REST connector as the source and an Azure SQL database as the sink. The linked service URL is paginated, and I have applied a pagination rule. However, the pipeline fails with the error below when I run the data flow:

Operation on target KeapContactData failed: {"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: Failure to read most recent page request:","Details":"DF-Rest_015 - Failure to read most recent page request: DF-REST_001 - Error response from server: Some({
  "code": "429",
  "message": "Quota Exceeded",
  "status": "Request Throttled",
  "details": null
}), Status code: 429. Please check your request url and body. (url:, request body: None, request method: GET)
	at$.failure(Utils.scala:76)
	at$anonfun$readResourcesWithDynamicPaging$1(RestClient.scala:88)
	at scala.util.Try$.apply(Try.scala:213)
	at
	at
	at
	at org.apache.spark.rdd.R"}

From the message, I understand the error concerns the request quota and rate limit, but I have not been able to resolve it. I have already set the request interval to 16000 ms to give the service time between page requests.
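For context, here is a minimal sketch of the client-side behavior needed to survive a 429 on a paginated REST source: retry the same page with exponential backoff instead of failing the run. The function names, the `items`/`next` response shape, and the fake server are assumptions for illustration, not the actual Keap API or the ADF internals; Data Flows do not expose custom retry logic like this directly, but the sketch shows what the connector would have to do.

```python
import time

def fetch_all_pages(get_page, max_retries=5, base_delay=0.01):
    """Fetch every page, backing off exponentially on HTTP 429.

    `get_page(page)` stands in for one REST call; it returns
    (status_code, body), where body has an 'items' list and an
    optional 'next' page number (None on the last page).
    These names are illustrative, not a real API contract."""
    results = []
    page = 1
    while page is not None:
        retries = 0
        while True:
            status, body = get_page(page)
            if status == 429:  # "Quota Exceeded" / "Request Throttled"
                if retries >= max_retries:
                    raise RuntimeError(f"still throttled on page {page}")
                time.sleep(base_delay * (2 ** retries))  # exponential backoff
                retries += 1
                continue
            break
        results.extend(body["items"])
        page = body.get("next")
    return results

# Hypothetical throttling server: every third call returns 429.
calls = {"n": 0}
def fake_server(page):
    calls["n"] += 1
    if calls["n"] % 3 == 0:
        return 429, {}
    next_page = page + 1 if page < 3 else None
    return 200, {"items": [page], "next": next_page}

print(fetch_all_pages(fake_server))
```

The key point is that a fixed request interval (like the 16000 ms above) only spaces requests out; it does not retry a page the server has already rejected, which is why the run still fails on a single 429.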

I appreciate the help.

Azure SQL Database
Azure Data Factory