Applies to:
SQL Server
SSIS Integration Runtime in Azure Data Factory
The HDFS File Destination component enables an SSIS package to write data to an HDFS file. The supported file formats are Text, Avro, and ORC.
To configure the HDFS File Destination, drag and drop the HDFS File Destination on the data flow designer, and double-click the component to open the editor.
Configure the following options on the General tab of the Hadoop File Destination Editor dialog box.
| Field | Description |
|---|---|
| Hadoop Connection | Specify an existing Hadoop Connection Manager, or create a new one. This connection manager indicates where the HDFS files are hosted. |
| File Path | Specify the name of the HDFS file. |
| File format | Specify the format for the HDFS file. The available options are Text, Avro, and ORC. |
| Column delimiter character | If you select Text format, specify the column delimiter character. |
| Column names in the first data row | If you select Text format, specify whether the first row in the file contains column names. |
After you configure these options, select the Columns tab to map source columns to destination columns in the data flow.
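After the package runs, you may want to confirm that the file landed in HDFS with the expected delimiter and header row. The following sketch is a minimal check using Python and the pyarrow library; the namenode host, port, file path, and delimiter are hypothetical placeholders, not values from this article.

```python
# Minimal sketch: read back a Text-format file written by the HDFS File Destination
# and print the first few rows. Host, port, path, and delimiter are placeholders.
# Requires pyarrow built with libhdfs support and a reachable Hadoop cluster.
import pyarrow.fs as pafs

hdfs = pafs.HadoopFileSystem(host="namenode.example.com", port=8020)

with hdfs.open_input_stream("/landing/ssis_output.txt") as stream:
    data = stream.read().decode("utf-8")

lines = data.splitlines()
print(lines[0])          # header row, if "Column names in the first data row" was selected
for row in lines[1:4]:   # a few data rows, split on the configured column delimiter
    print(row.split("|"))
```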
Java is required to use the ORC file format. The architecture (32-bit or 64-bit) of the Java build must match that of the SSIS runtime in use. The following Java builds have been tested.
To set the Java Home environment variable, open the System Properties dialog box (run sysdm.cpl) and add a new system variable. Enter JAVA_HOME for the Variable name, browse to the jre subfolder of the Java installation, then select OK, and the Variable value is populated automatically.
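As a quick sanity check, the sketch below verifies that JAVA_HOME is set and that the Java build's bitness matches the SSIS runtime you plan to use. The 64-bit detection relies on the "64-Bit" string that common JVMs print in their `java -version` banner; adjust it for your distribution. This is an illustrative assumption, not a step from this article.

```python
# Minimal sketch: confirm JAVA_HOME is set and report the Java build's bitness.
# Run it on the machine that executes the SSIS package; adjust the expected
# bitness if you use the 32-bit SSIS runtime. Assumes Windows (java.exe).
import os
import subprocess
import sys

java_home = os.environ.get("JAVA_HOME")
if not java_home:
    sys.exit("JAVA_HOME is not set")

java_exe = os.path.join(java_home, "bin", "java.exe")
if not os.path.isfile(java_exe):
    sys.exit(f"No java executable found under {java_home}")

# `java -version` writes its banner to stderr; 64-bit JVMs typically include "64-Bit".
banner = subprocess.run([java_exe, "-version"],
                        capture_output=True, text=True).stderr
print(banner.strip())
print("64-bit Java detected" if "64-Bit" in banner else "32-bit Java (or unknown) detected")
```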