Data Factory Copy Activity - found more columns than expected

Ryan Abbey 1,176 Reputation points

We are getting the error: "found more columns than expected column count: 50".

A scan of all rows shows that every row has 50 columns, and the row in question looks exactly like every row before it (which presumably processed without issue). The data is numeric, except for three columns that each hold a single character, so there is no mismatch of quotes, embedded delimiters, etc.

I've run the rows around the offending row through a hex converter and found no mystery characters; all rows have the same row delimiter. Has anyone found a cause other than an embedded delimiter?
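For anyone hitting the same error, the column scan described above can be reproduced outside ADF with a short script. This is a hypothetical diagnostic, not anything ADF runs itself; the 50-column count and comma delimiter are assumptions to adjust to your dataset settings.

```python
import csv

# Count the parsed columns of each row and report any row whose count
# differs from the expected value. A quoted value containing a comma
# still parses as one column, so only genuine extra delimiters show up.
def find_bad_rows(lines, expected=50, delimiter=","):
    bad = []
    for lineno, row in enumerate(csv.reader(lines, delimiter=delimiter), start=1):
        if len(row) != expected:
            bad.append((lineno, len(row)))
    return bad

# Usage sketch: with open("data.csv", newline="") as f: print(find_bad_rows(f))
```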

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

3 answers

  1. HimanshuSinha-msft 19,381 Reputation points Microsoft Employee

    Hello @Ryan Abbey ,
    Thanks for the question and for using the Microsoft Q&A platform.

    This is a very data-centric question, and it is difficult to help without looking at the data. Could you share the CSV file (although the size of the file may be an issue)? If sharing the file here is a problem, you can email the file at .

    One thing you can try (if possible) is to ingest the CSV file into a SQL table, making sure to choose the auto-create table option. I am assuming that with that option set, ADF will not complain about the mismatch and will create a 51st column, and you can then query that table to find the row.

    Please let me know how it goes.


  2. Ryan Abbey 1,176 Reputation points

    Hi Himanshu, I re-ran the process with "Fault tolerance" configured. The rows it threw out were a merge of two rows in the file, but those rows are definitely not "merged" in the file; there is a CRLF between the two! We compressed the file with gzip instead and it ran through without any error, which suggests the issue is with ZipDeflate.
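One way to cross-check the gzip copy outside ADF is to stream it with Python's `gzip` and `csv` modules and look at per-row column counts, so a "merged" row shows up as a count other than 50. A sketch; the path and delimiter are assumptions:

```python
import csv
import gzip

# Stream a gzip-compressed CSV without fully decompressing it to disk
# and return the parsed column count of every row.
def gz_column_counts(path, delimiter=","):
    with gzip.open(path, mode="rt", newline="", encoding="utf-8") as f:
        return [len(row) for row in csv.reader(f, delimiter=delimiter)]
```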

    PS: the file was large - 1.5 GB - but our course of action is going to be postponed...

  3. Han Shih 施學翰 146 Reputation points

    I am facing the same situation.

    My CSV file has values like "123,456", which I think causes this error.

    However, when I check "preview data", it looks fine.

    Any suggestions?
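The quoting behaviour Han describes can be demonstrated with Python's `csv` module: a comma inside a value stays in one column only if the value is quoted, otherwise it splits the row into an extra column, which is exactly the "found more columns than expected" case. A minimal sketch (the column names are made up):

```python
import csv
import io

# The same value with and without quoting around the embedded comma.
quoted = 'id,amount\n1,"123,456"\n'
unquoted = 'id,amount\n1,123,456\n'

quoted_counts = [len(r) for r in csv.reader(io.StringIO(quoted))]
unquoted_counts = [len(r) for r in csv.reader(io.StringIO(unquoted))]
print(quoted_counts)    # [2, 2]
print(unquoted_counts)  # [2, 3] - the extra column that trips the copy activity
```

If the source writes values like this, the dataset's quote character in ADF has to match the one used in the file for the row to parse as 50 columns.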
