Error when copying data from Big Query in Azure Data Factory

Hybalo, Oleksandr 6 Reputation points
2022-06-23T08:18:00.317+00:00

Hello, I am facing an error when trying to copy a data set from Google BigQuery. It works fine with a small data set (10,000 rows), but when I try to extract, for example, 100,000 rows, I get the error below.
For the connection I am using the User authentication method.
I do not violate any of the Google Cloud limits.

Failure happened on 'Source' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API. Check your account permissions.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=System.Data.Odbc.OdbcException,Message=ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API. Check your account permissions.,Source=Microsoft ODBC Driver for Google BigQuery,'

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. ShaikMaheer-MSFT 38,441 Reputation points Microsoft Employee
    2022-06-24T12:32:56.657+00:00

    Hi @Hybalo, Oleksandr ,

    Thank you for posting query in Microsoft Q&A Platform.

    The error says it is an authentication issue, but as you mentioned, it works fine with a small data set, so authentication itself is likely not the problem. With Google BigQuery there are a couple of important points to know when handling large requests or large volumes of data. I have listed them below; kindly make sure to follow them and see if that helps.

    This Google BigQuery connector is built on top of the BigQuery APIs. Be aware that BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis, refer to Quotas & Limits - API requests. Make sure you do not trigger too many concurrent requests to the account.
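    If request throttling is the suspect, one thing to try is lowering the copy activity's `parallelCopies` setting so the source issues fewer concurrent read requests. A minimal sketch of a copy activity definition (the activity name, query, and sink type here are placeholders, not taken from your pipeline):

    ```json
    {
        "name": "CopyFromBigQuery",
        "type": "Copy",
        "typeProperties": {
            "source": {
                "type": "GoogleBigQuerySource",
                "query": "SELECT * FROM mydataset.mytable"
            },
            "sink": {
                "type": "DelimitedTextSink"
            },
            "parallelCopies": 1
        }
    }
    ```

    Starting with `parallelCopies` of 1 and raising it gradually can help isolate whether concurrency is triggering the failure.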

    The minimum scope required to obtain an OAuth 2.0 refresh token is https://www.googleapis.com/auth/bigquery.readonly. If you plan to run a query that might return large results, additional scopes might be required. For more information, refer to this article.
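    For reference, a linked service using User authentication looks roughly like the sketch below (placeholder values throughout; shape follows the connector documentation). If your refresh token was generated with only the readonly scope, regenerating it with a broader scope such as https://www.googleapis.com/auth/bigquery may resolve errors on large reads:

    ```json
    {
        "name": "GoogleBigQueryLinkedService",
        "properties": {
            "type": "GoogleBigQuery",
            "typeProperties": {
                "project": "<project ID>",
                "requestGoogleDriveScope": true,
                "authenticationType": "UserAuthentication",
                "clientId": "<client ID used to generate the refresh token>",
                "clientSecret": {
                    "type": "SecureString",
                    "value": "<client secret>"
                },
                "refreshToken": {
                    "type": "SecureString",
                    "value": "<refresh token generated with the required scopes>"
                }
            }
        }
    }
    ```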

    Please click here for the complete documentation of the Google BigQuery connector in Azure Data Factory.

    Please let us know how it goes.

