Microsoft JDBC Driver for SQL Server version 9.2 and later supports using the Bulk Copy API for batch insert operations. This feature lets the driver transparently perform a bulk copy operation underneath when executing a batch insert, inserting the same data as a regular batch insert would but with better performance. The driver parses the user's SQL query and uses the Bulk Copy API instead of the usual batch insert operation. The following sections describe the ways to enable the Bulk Copy API for batch insert, list its limitations, and provide a small code sample that demonstrates its usage and the performance gain.
This feature is only applicable to the executeBatch() and executeLargeBatch() APIs of PreparedStatement and CallableStatement.
Prerequisites to enable the Bulk Copy API for batch insert

There are three ways to enable the Bulk Copy API for batch insert:
Adding useBulkCopyForBatchInsert=true; to the connection string enables this feature:
Connection connection = DriverManager.getConnection("jdbc:sqlserver://<server>:<port>;userName=<user>;password=<password>;database=<database>;encrypt=true;useBulkCopyForBatchInsert=true;");
Calling SQLServerConnection.setUseBulkCopyForBatchInsert(true) enables this feature.
SQLServerConnection.getUseBulkCopyForBatchInsert() retrieves the current value for useBulkCopyForBatchInsert connection property.
The value of useBulkCopyForBatchInsert is captured by each PreparedStatement at the time it's created. Subsequent calls to SQLServerConnection.setUseBulkCopyForBatchInsert() don't affect already created PreparedStatements.
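As a minimal sketch of the setter-based approach (assuming the mssql-jdbc driver is on the classpath, a reachable SQL Server, and placeholder credentials substituted for the `<server>` tokens), the following also illustrates that each PreparedStatement captures the property's value at creation time:

```java
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import com.microsoft.sqlserver.jdbc.SQLServerConnection;

public class EnableViaSetter {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string -- substitute a real server, database, and credentials.
        String url = "jdbc:sqlserver://<server>:<port>;databaseName=<database>;user=<user>;password=<password>;encrypt=true;";

        try (SQLServerConnection con = (SQLServerConnection) DriverManager.getConnection(url)) {
            con.setUseBulkCopyForBatchInsert(true);
            System.out.println(con.getUseBulkCopyForBatchInsert()); // prints true

            // This statement is created while the property is true, so its batches
            // use the Bulk Copy API ...
            try (PreparedStatement pstmt = con.prepareStatement("insert into batchTest values (?, ?)")) {
                // ... even if the connection-level property is changed afterwards.
                con.setUseBulkCopyForBatchInsert(false);
                for (int i = 0; i < 1000; i++) {
                    pstmt.setInt(1, i);
                    pstmt.setString(2, "test" + i);
                    pstmt.addBatch();
                }
                pstmt.executeBatch();
            }
        }
    }
}
```

The table name batchTest is a stand-in for any existing two-column table; the snippet requires a live SQL Server to run.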
The third option is similar to the previous one, but uses SQLServerDataSource to create the SQLServerConnection object. Both methods achieve the same result.
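A sketch of the data-source approach, again with placeholder server, port, and credential values that must be replaced before the code can connect:

```java
import com.microsoft.sqlserver.jdbc.SQLServerConnection;
import com.microsoft.sqlserver.jdbc.SQLServerDataSource;

public class EnableViaDataSource {
    public static void main(String[] args) throws Exception {
        SQLServerDataSource dataSource = new SQLServerDataSource();
        // Placeholder values -- substitute your own server, database, and credentials.
        dataSource.setServerName("<server>");
        dataSource.setPortNumber(1433);
        dataSource.setDatabaseName("<database>");
        dataSource.setUser("<user>");
        dataSource.setPassword("<password>");
        // Enable the Bulk Copy API for batch inserts on connections from this data source.
        dataSource.setUseBulkCopyForBatchInsert(true);

        try (SQLServerConnection con = (SQLServerConnection) dataSource.getConnection()) {
            System.out.println(con.getUseBulkCopyForBatchInsert()); // prints true
        }
    }
}
```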
There are currently these limitations that apply to this feature:

- Non-parameterized queries with literal values (for example, INSERT INTO TABLE VALUES (?, 2)) aren't supported. Wildcards (?) are the only supported parameters for this feature.
- INSERT queries that contain a subquery (for example, INSERT INTO TABLE SELECT * FROM TABLE2) aren't supported.
- Multi-row INSERT queries (for example, INSERT INTO TABLE VALUES (1, 2), (3, 4)) aren't supported.
- IDENTITY_INSERT isn't managed by the driver. Either don't include identity columns in insert statements, manually set the IDENTITY_INSERT state of your tables between batch insert statements, or manually pass an explicit value for the identity column with the insert statement. For more information, see SET IDENTITY_INSERT.
- The MONEY, SMALLMONEY, DATE, DATETIME, DATETIMEOFFSET, SMALLDATETIME, TIME, GEOMETRY, and GEOGRAPHY data types are currently not supported for this feature.
- If the query fails because of errors unrelated to the SQL Server instance, the driver logs the error message and falls back to the original logic for batch insert.
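Because the driver doesn't manage IDENTITY_INSERT, one possible workaround is to toggle it manually around the batch with SET IDENTITY_INSERT and pass explicit identity values. The sketch below assumes a live SQL Server; the table dbo.idTest and its columns are hypothetical names invented for this example:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class IdentityInsertWorkaround {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; substitute real values before running.
        String url = "jdbc:sqlserver://<server>:<port>;databaseName=<database>;user=<user>;password=<password>;encrypt=true;useBulkCopyForBatchInsert=true;";

        try (Connection con = DriverManager.getConnection(url);
                Statement stmt = con.createStatement()) {
            stmt.execute("if object_id(N'dbo.idTest', N'U') is not null drop table dbo.idTest");
            stmt.execute("create table dbo.idTest (id int identity(1,1), c1 varchar(20))");

            // Allow explicit values for the identity column on this table.
            stmt.execute("SET IDENTITY_INSERT dbo.idTest ON");
            try (PreparedStatement pstmt = con.prepareStatement("insert into dbo.idTest (id, c1) values (?, ?)")) {
                for (int i = 1; i <= 10; i++) {
                    pstmt.setInt(1, i);            // explicit value for the identity column
                    pstmt.setString(2, "row" + i);
                    pstmt.addBatch();
                }
                pstmt.executeBatch();
            } finally {
                // Restore the default so later inserts generate identity values again.
                stmt.execute("SET IDENTITY_INSERT dbo.idTest OFF");
            }
        }
    }
}
```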
This example demonstrates a batch insert operation of a thousand rows, first with a regular batch insert and then with the Bulk Copy API.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class BulkCopyBatchInsertSample {

    public static void main(String[] args) throws Exception {
        String tableName = "batchTest";
        String tableNameBulkCopyAPI = "batchTestBulk";
        String connectionUrl = "jdbc:sqlserver://<server>:<port>;encrypt=true;databaseName=<database>;user=<user>;password=<password>";

        // Regular batch insert: useBulkCopyForBatchInsert is not set.
        try (Connection con = DriverManager.getConnection(connectionUrl);
                Statement stmt = con.createStatement();
                PreparedStatement pstmt = con.prepareStatement("insert into " + tableName + " values (?, ?)")) {
            String dropSql = "if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[" + tableName + "]') and OBJECTPROPERTY(id, N'IsUserTable') = 1) DROP TABLE [" + tableName + "]";
            stmt.execute(dropSql);
            String createSql = "create table " + tableName + " (c1 int, c2 varchar(20))";
            stmt.execute(createSql);

            System.out.println("Starting batch operation using regular batch insert operation.");
            long start = System.currentTimeMillis();
            for (int i = 0; i < 1000; i++) {
                pstmt.setInt(1, i);
                pstmt.setString(2, "test" + i);
                pstmt.addBatch();
            }
            pstmt.executeBatch();
            long end = System.currentTimeMillis();
            System.out.println("Finished. Time taken : " + (end - start) + " milliseconds.");
        }

        // Same workload, with useBulkCopyForBatchInsert=true appended to the connection string.
        try (Connection con = DriverManager.getConnection(connectionUrl + ";useBulkCopyForBatchInsert=true");
                Statement stmt = con.createStatement();
                PreparedStatement pstmt = con.prepareStatement("insert into " + tableNameBulkCopyAPI + " values (?, ?)")) {
            String dropSql = "if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[" + tableNameBulkCopyAPI + "]') and OBJECTPROPERTY(id, N'IsUserTable') = 1) DROP TABLE [" + tableNameBulkCopyAPI + "]";
            stmt.execute(dropSql);
            String createSql = "create table " + tableNameBulkCopyAPI + " (c1 int, c2 varchar(20))";
            stmt.execute(createSql);

            System.out.println("Starting batch operation using Bulk Copy API.");
            long start = System.currentTimeMillis();
            for (int i = 0; i < 1000; i++) {
                pstmt.setInt(1, i);
                pstmt.setString(2, "test" + i);
                pstmt.addBatch();
            }
            pstmt.executeBatch();
            long end = System.currentTimeMillis();
            System.out.println("Finished. Time taken : " + (end - start) + " milliseconds.");
        }
    }
}
Result:
Starting batch operation using regular batch insert operation.
Finished. Time taken : 104132 milliseconds.
Starting batch operation using Bulk Copy API.
Finished. Time taken : 1058 milliseconds.