BCP works, and I used the bulk insert option from Azure Data Factory.
There is no SQL script to perform the bulk operation in a dedicated SQL pool.
COPY INTO is erroring out because of the PolyBase row size limit:
HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopExecutionException: The size of the schema/row at ordinal 44555 is 1408314 bytes. It exceeds the maximum allowed row size of 1000000 bytes for Polybase.
Getting the above error while executing the below command.
Any help would be highly appreciated!
COPY INTO table
FROM ''
WITH (
    FILE_TYPE = 'PARQUET'
    --, MAXERRORS = 100
)
2 answers
-
Lokesh
2023-02-08T10:34:17+00:00
-
BhargavaGunnam-MSFT (Microsoft Employee)
2023-02-08T20:06:48+00:00

Hello @Loki,
Welcome to the MS Q&A platform.
Behind the scenes, PolyBase creates an intermediate external table, and the maximum length of a string column in that external table is nvarchar(4000).
So, if the maximum length of a source column is greater than nvarchar(4000), the PolyBase option can't be used.
You can use the bulk insert option or copy into option in copy activity.
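As an illustration, both the PolyBase and COPY-statement load paths can be switched off on the copy activity sink so that the activity falls back to bulk insert. This is a minimal sketch of the sink section of a copy activity definition; the property names come from the SqlDWSink sink type, and the exact values are assumptions you should verify against your own pipeline:

```json
{
  "sink": {
    "type": "SqlDWSink",
    "allowPolyBase": false,
    "allowCopyCommand": false,
    "writeBatchSize": 10000
  }
}
```

With both allowPolyBase and allowCopyCommand set to false, the copy activity should use plain bulk insert, which avoids the PolyBase row size limit at the cost of slower loads.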
Please see the reference document: Row size and data type limits using PolyBase.
I hope this helps. Please let me know if you have any further questions.