Azure Data Factory | Google BigQuery | API Limit Exceeded

Pisat, Pritesh 1 Reputation point
2022-10-27T11:32:41.997+00:00

My requirement is to extract a huge volume of daily data (~35 GB, i.e. ~3 million parent rows) from Google BigQuery using ADF, so I am dividing it into chunked loads of ~127k records per pipeline run (a sketch of how the chunk range is parameterized follows the query below). However, I get an "API Limit Exceeded" error whenever I extract more than 625 rows.

I am using the query below as the Source in my Copy Data activity. [Note: I am using this customized logic to convert the result set into JSON format so that I can create a JSON file.]

WITH TestResult AS (
    SELECT A.*
    FROM (
        SELECT RANK() OVER (ORDER BY {Column X} ASC) AS rownum, *
        FROM {Schema}.{Table}
        WHERE {Column Y} = '2022-10-27'
    ) AS A
    WHERE A.rownum >= 1
      AND A.rownum <= 625
    ORDER BY rownum
)
SELECT CONCAT("[", JSON_PATH, "]")
FROM (
    SELECT STRING_AGG(TO_JSON_STRING(t, true)) AS JSON_PATH
    FROM TestResult AS t
);
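
For context, the rownum range (1 to 625 above) is what I vary per chunk. A minimal sketch of how that range might be supplied per pipeline run, assuming ADF dynamic content string interpolation in the source query and hypothetical pipeline parameters named startRow and endRow (those names are illustrative, not from my actual pipeline):

-- Illustrative only: same query, with the chunk bounds injected per run.
WITH TestResult AS (
    SELECT A.*
    FROM (
        SELECT RANK() OVER (ORDER BY {Column X} ASC) AS rownum, *
        FROM {Schema}.{Table}
        WHERE {Column Y} = '2022-10-27'
    ) AS A
    WHERE A.rownum >= @{pipeline().parameters.startRow}
      AND A.rownum <= @{pipeline().parameters.endRow}
    ORDER BY rownum
)
SELECT CONCAT("[", JSON_PATH, "]")
FROM (
    SELECT STRING_AGG(TO_JSON_STRING(t, true)) AS JSON_PATH
    FROM TestResult AS t
);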

My latest observation on this: since I am converting all rows into a single output row, the issue may be that I am exceeding the per-row data size limit.
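
For illustration, a variant that emits one JSON string per source row, instead of concatenating the whole chunk into a single row, would look roughly like this (same placeholders as above; whether this fits the rest of my pipeline and sink settings is something I still need to verify):

-- Rough sketch: one JSON object per row, so no single result row has to
-- hold the entire chunk. The rownum window logic is unchanged.
WITH TestResult AS (
    SELECT A.*
    FROM (
        SELECT RANK() OVER (ORDER BY {Column X} ASC) AS rownum, *
        FROM {Schema}.{Table}
        WHERE {Column Y} = '2022-10-27'
    ) AS A
    WHERE A.rownum >= 1
      AND A.rownum <= 625
)
SELECT TO_JSON_STRING(t) AS json_row
FROM TestResult AS t
ORDER BY rownum;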
