Azure Data Factory: Operation Time Out Error

Meera Karthik 1 Reputation point
2021-10-05T21:14:57.747+00:00

I'm creating a data flow in ADF and have connected an Azure Blob Storage source. After connecting and importing the schema, I tried to view the data preview, but I keep receiving these two error messages: (1) "Could not fetch statistics due to operation timeout" and (2) "Failed to fetch data preview due to operation timeout." I would greatly appreciate any help figuring out what is causing this and preventing me from previewing the data.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. KranthiPakala-MSFT 46,427 Reputation points Microsoft Employee
    2021-10-07T01:02:58.923+00:00

    Hi @Meera Karthik,

    Welcome to Microsoft Q&A forum and thanks for reaching out here.

    Looking at the error messages, it seems the data preview call is failing to read the source data. Could you please confirm whether your source file is very large? In ADF mapping data flows, file sources only limit the rows that you see, not the rows being read. For very large datasets, it is recommended that you take a small portion of the file and use it for your testing. You can select a temporary file in Debug Settings for each source that is a file dataset type.
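    The "take a small portion of the file" step above can be sketched in plain Python; the file names here are placeholders, and this assumes a CSV source with a header row (not something stated in the original answer):

    ```python
    import csv

    def sample_csv(src_path, dst_path, max_rows=1000):
        """Copy the header plus the first max_rows data rows into a smaller file
        that can be pointed at from the data flow's Debug Settings."""
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst)
            writer.writerow(next(reader))  # copy the header row as-is
            for i, row in enumerate(reader):
                if i >= max_rows:
                    break
                writer.writerow(row)

    # Example (paths are hypothetical):
    # sample_csv("large_source.csv", "debug_sample.csv", max_rows=1000)
    ```

    Upload the resulting sample file to the same Blob container and select it as the temporary file in Debug Settings.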

    The data preview will only query the number of rows that you have set as your limit in your debug settings. Once you turn on debug mode, you can edit how a data flow previews data. Debug settings can be edited by clicking "Debug Settings" on the Data Flow canvas toolbar. You can select the row limit or file source to use for each of your Source transformations here. The row limits in this setting are only for the current debug session.

    The default IR used for debug mode in data flows is a small 4-core single worker node with a 4-core single driver node. This works fine with smaller samples of data when testing your data flow logic. If you expand the row limits in your debug settings during data preview or set a higher number of sampled rows in your source during pipeline debug, then you may wish to consider setting a larger compute environment in a new Azure Integration Runtime. Then you can restart your debug session using the larger compute environment.
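    As an illustrative sketch of that last step (this is my addition, not part of the original answer): a larger Azure Integration Runtime for data flows is defined by a `Managed` IR whose `dataFlowProperties` set the compute type and core count. The snippet below only builds the request body you would send to the Data Factory REST API (`PUT .../integrationRuntimes/{irName}?api-version=2018-06-01`); the parameter values shown are examples, not recommendations:

    ```python
    import json

    def build_dataflow_ir_body(compute_type="General", core_count=16, ttl_minutes=10):
        """Build the request body for a managed Azure IR sized for data flow debug.

        compute_type: "General", "MemoryOptimized", or "ComputeOptimized"
        core_count:   total vCores for the Spark cluster (e.g. 8, 16, 32)
        ttl_minutes:  how long to keep the cluster warm between debug runs
        """
        return {
            "properties": {
                "type": "Managed",
                "typeProperties": {
                    "computeProperties": {
                        "location": "AutoResolve",
                        "dataFlowProperties": {
                            "computeType": compute_type,
                            "coreCount": core_count,
                            "timeToLive": ttl_minutes,
                        },
                    }
                },
            }
        }

    print(json.dumps(build_dataflow_ir_body(), indent=2))
    ```

    Once the new IR exists, pick it under Debug Settings when starting the debug session so data preview runs on the larger cluster.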

    Please do try with a small subset of your original source data and see if you are able to preview it.

    Here is the reference doc related to data preview: Mapping data flow data preview

    Hope this info helps. Do let us know how it goes.

    ----------

    • Please don't forget to click "Accept Answer" and up-vote whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
    • Want a reminder to come back and check responses? Here is how to subscribe to a notification
    • If you are interested in joining the VM program and help shape the future of Q&A: Here is how you can be part of Q&A Volunteer Moderators
    1 person found this answer helpful.