This article describes the connection properties supported by the Databricks JDBC Driver (OSS).
Authentication and proxy properties
The following authentication and proxy properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
AsyncExecPollInterval | 200 | The time in milliseconds between each poll for the asynchronous query execution status. Asynchronous refers to the fact that the RPC call used to execute a query against Spark is asynchronous; it does not mean that JDBC asynchronous operations are supported. |
Auth_Flow | 0 | The OAuth2 authentication flow for the driver connection. This property is required if AuthMech is 11. |
Auth_JWT_Key_File | null | The path to the private key file (PEM format) for JWT authentication. |
Auth_JWT_Alg | RS256 | The algorithm for private key JWT authentication. The supported algorithms are RSA (RS256, RS384, RS512, PS256, PS384, PS512) and EC (ES256, ES384, ES512). |
Auth_JWT_Key_Passphrase | null | The passphrase for decrypting an encrypted private key. |
Auth_KID | null | The key identifier (KID) required for JWT authentication. This is mandatory when using private key JWT. |
AuthMech | Required | The authentication mechanism, where 3 specifies an Azure Databricks personal access token and 11 specifies OAuth 2.0 tokens. Additional properties are required for each mechanism. See Authenticate the driver. |
CFProxyAuth | 0 | If set to 1, the driver uses the proxy authentication user and password, represented by CFProxyUID and CFProxyPwd. |
CFProxyHost | null | A string that represents the name of the proxy host to use when UseCFProxy is also set to 1. |
CFProxyPort | null | An integer that represents the number of the proxy port to use when UseCFProxy is also set to 1. |
CFProxyUID | null | A string that represents the username to use for proxy authentication when CFProxyAuth and UseCFProxy are also set to 1. |
CFProxyPwd | null | A string that represents the password to use for proxy authentication when CFProxyAuth and UseCFProxy are also set to 1. |
ConnCatalog or catalog | SPARK | The name of the default catalog to use. |
ConnSchema or schema | default | The name of the default schema to use. This can be specified either by replacing <schema> in the URL with the name of the schema to use or by setting the ConnSchema property to the name of the schema to use. |
GoogleServiceAccount | null | Enables authentication using a Google service account. |
GoogleCredentialsFile | null | The path to the JSON key file for Google service account authentication. |
EnableOIDCDiscovery | 1 | If set to 1, the OpenID Connect discovery URL is used. |
OIDCDiscoveryEndpoint | null | The OpenID Connect discovery URL for retrieving the OIDC configuration. |
Auth_RefreshToken | null | The OAuth2 refresh token used to retrieve a new access token. |
OAuth2ConnAuthAuthorizeEndpoint | null | The authorization endpoint URL used in an OAuth2 flow. |
OAuth2ConnAuthTokenEndpoint | null | The token endpoint URL for the OAuth2 flow. |
ProxyAuth | 0 | If set to 1, the driver uses the proxy authentication user and password, represented by ProxyUID and ProxyPwd. |
ProxyHost | null | A string that represents the name of the proxy host to use when UseProxy is also set to 1. |
ProxyPort | null | An integer that represents the number of the proxy port to use when UseProxy is also set to 1. |
ProxyPwd | null | A string that represents the password to use for proxy authentication when ProxyAuth and UseProxy are also set to 1. |
ProxyUID | null | A string that represents the username to use for proxy authentication when ProxyAuth and UseProxy are also set to 1. |
UseProxy | 0 | If set to 1, the driver uses the provided proxy settings (for example, ProxyAuth, ProxyHost, ProxyPort, ProxyPwd, and ProxyUID). |
UseSystemProxy | 0 | If set to 1, the driver uses the proxy settings that have been set at the system level. Any additional proxy properties set in the connection URL override those set at the system level. |
UseCFProxy | 0 | If set to 1, the driver uses the cloud fetch proxy settings if they are provided; otherwise it uses the regular proxy. |
UseJWTAssertion | false | Enables private key JWT authentication for M2M use cases where client secret authentication is restricted. |
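The following sketch shows one way to pass these properties programmatically through java.sql.DriverManager, combining personal access token authentication with an authenticated proxy. The host, HTTP path, proxy values, and the UID/PWD token convention are illustrative assumptions rather than part of this reference; see Authenticate the driver for the authoritative authentication properties.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ProxyConnectionExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical workspace host and HTTP path; replace with your own values.
        String url = "jdbc:databricks://<server-hostname>:443;httpPath=<http-path>";

        Properties props = new Properties();
        // Personal access token authentication (AuthMech=3). The UID/PWD convention
        // shown here is an assumption; see "Authenticate the driver" for the exact
        // authentication properties.
        props.setProperty("AuthMech", "3");
        props.setProperty("UID", "token");
        props.setProperty("PWD", "<personal-access-token>");

        // Route traffic through an authenticated HTTP proxy using the proxy
        // properties from the table above. Host, port, and credentials are placeholders.
        props.setProperty("UseProxy", "1");
        props.setProperty("ProxyHost", "proxy.example.com");
        props.setProperty("ProxyPort", "8080");
        props.setProperty("ProxyAuth", "1");
        props.setProperty("ProxyUID", "proxyuser");
        props.setProperty("ProxyPwd", "<proxy-password>");

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected to catalog: " + conn.getCatalog());
        }
    }
}
```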
SSL trust store configuration properties
The following SSL trust store configuration properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
AllowSelfSignedCerts | 0 | If set to 1, the driver allows connections to servers with self-signed SSL certificates. |
CheckCertificateRevocation | 0 | If set to 1, the driver checks whether the SSL certificate has been revoked. |
SSL | 1 | Whether the connector communicates with the Spark server through an SSL-enabled socket. |
SSLTrustStore | null | The path to the trust store file for SSL certificate validation. |
SSLTrustStorePassword | null | The password for the trust store file, if it is password-protected. |
SSLTrustStoreType | JKS | The type of the trust store. Valid types are JKS, PKCS12, and BCFKS. If not specified, the driver defaults to JKS. |
UseSystemTrustStore | 0 | If set to 1, the driver uses the system's default trust store for SSL certificate verification. |
Trust store types
The JDBC driver supports the following SSL modes and trust store types.
Self-signed certificate mode
To use self-signed certificate mode, set the connection property AllowSelfSignedCerts=1. This mode uses a trust-all socket factory that accepts any certificate.
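For example, a connection URL of the following form enables self-signed certificate mode (the host, port, and httpPath values are placeholders for your workspace details):

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;AllowSelfSignedCerts=1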
Custom trust store
To use a custom trust store, specify a custom trust store file in the SSLTrustStore connection property. The trust store is loaded directly from the specified path, and its certificates are used for SSL certificate validation. It can be in JKS, PKCS12, or other supported formats.
You must specify the following additional connection properties:
- SSLTrustStore: Path to the trust store file
- SSLTrustStorePassword: Password for the trust store (if needed)
- SSLTrustStoreType: Type of trust store (optional; defaults to JKS if not specified)
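For example, a connection URL pointing at a hypothetical PKCS12 trust store might look like the following (host, httpPath, file path, and password are placeholders):

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;SSL=1;SSLTrustStore=/path/to/truststore.p12;SSLTrustStorePassword=<password>;SSLTrustStoreType=PKCS12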
Java system property trust store
To use the system property trust store, set UseSystemTrustStore=1
and make sure that you do not specify a custom trust store. Instead, specify a trust store using the Java system property javax.net.ssl.trustStore
. This property is set at the JVM level using the -D
flag, for example:
java -Djavax.net.ssl.trustStore=/path/to/truststore.jks -Djavax.net.ssl.trustStorePassword=changeit ...
The JDBC driver first checks for the Java system property javax.net.ssl.trustStore. If it is set, the driver uses that trust store file instead of the JDK's default. If no system property is set, the driver uses the JDK's default trust store (cacerts), which is located at $JAVA_HOME/lib/security/cacerts or a similar path.
JDK default trust store (cacerts)
The JDK comes with a built-in trust store called cacerts, which contains certificates from well-known certificate authorities (CAs) and allows verification of certificates issued by those CAs. This trust store is typically located at $JAVA_HOME/lib/security/cacerts, with a default password of "changeit" or "changeme".
To use the JDK default trust store, set UseSystemTrustStore=1 and make sure that you do not specify a custom trust store or a Java system property trust store. If a trust store is also specified using the Java system property javax.net.ssl.trustStore, it is ignored, which ensures the driver only uses certificates from the default JDK trust store.
Trust store order of precedence
The driver uses the following order of precedence to determine which trust store to use:
- The custom trust store specified in the SSLTrustStore connection property
- The trust store specified in the Java system property javax.net.ssl.trustStore (when UseSystemTrustStore=1)
- The JDK's default trust store (cacerts)
Security recommendations
To keep your connection secure, Databricks recommends the following:
For production environments:
- Do not use self-signed certificate mode (AllowSelfSignedCerts=1).
- Use official CA-signed certificates.
- Use UseSystemTrustStore=1 unless you need a custom trust store.
For custom trust stores:
- Use when connecting to servers with certificates not in the default trust store.
- Ensure the trust store contains the complete certificate chain.
- Protect trust store files with appropriate permissions.
SQL configuration properties
The following SQL configuration properties are supported by the Databricks JDBC Driver (OSS). These are also described in Configuration parameters. Properties are case insensitive.
Property | Default value | Description |
---|---|---|
ansi_mode | TRUE | Whether to enable strict ANSI SQL behavior for certain functions and casting rules. |
enable_photon | TRUE | Whether to enable the Photon vectorized query engine. |
legacy_time_parser_policy | EXCEPTION | The methods used to parse and format dates and timestamps. Valid values are EXCEPTION, LEGACY, and CORRECTED. |
max_file_partition_bytes | 128m | The maximum number of bytes to pack into a single partition when reading from file-based sources. The setting can be any positive integer and can optionally include a measure such as b (bytes), or k or kb (1024 bytes). |
read_only_external_metastore | false | Controls whether an external metastore is treated as read-only. |
statement_timeout | 172800 | Sets a SQL statement timeout between 0 and 172800 seconds. |
timezone | UTC | Sets the local timezone. Accepts region IDs in the form area/city, such as America/Los_Angeles, or zone offsets in the format (+\|-)HH, (+\|-)HH:mm, or (+\|-)HH:mm:ss, for example -08, +01:00, or -13:33:33. UTC is supported as an alias for +00:00. |
use_cached_result | true | Whether Databricks SQL caches and reuses results whenever possible. |
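As a hedged illustration, assuming these SQL configuration parameters are appended to the connection URL in the same way as the other connection properties in this article (the host and httpPath values are placeholders):

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;ansi_mode=TRUE;statement_timeout=3600;timezone=UTC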
Logging properties
The following logging properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
LogLevel | OFF | The logging level, which is a value from 0 through 6. Use this property to enable or disable logging in the connector and to specify the amount of detail included in log files. |
LogPath | The driver determines the default path from the values set for certain system properties, in priority order. | The full path to the folder where the connector saves log files when logging is enabled, as a string. To ensure that the connection URL is compatible with all JDBC applications, escape the backslashes (\) in your file path by typing another backslash. If the LogPath value is invalid, the connector sends the logged information to the standard output stream (System.out). |
LogFileSize | No maximum | The maximum allowed log file size, specified in MB. |
LogFileCount | No maximum | The maximum number of allowed log files. |
Enable and configure logging
The JDBC driver supports the Simple Logging Facade for Java (SLF4J) and java.util.logging (JUL) frameworks. The driver uses the JUL logging framework by default.
To enable and configure logging for the JDBC driver:
Enable the logging framework that you want to use:
- For SLF4J logging, set the system property -Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER and provide the SLF4J binding implementation (compatible with SLF4J version 2.0.13 and above) and the corresponding configuration file in the classpath (see the example command after this list).
- For JUL logging, set the system property -Dcom.databricks.jdbc.loggerImpl=JDKLOGGER. This is the default.
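For SLF4J, the logger selection is supplied at the JVM level with the -D flag, in the same way as the trust store example earlier. The application jar, binding jars, and main class below are placeholders, and logback is only one possible SLF4J binding:

java -Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER -cp app.jar:slf4j-api-2.0.13.jar:logback-classic-1.5.6.jar:logback-core-1.5.6.jar com.example.MyApp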
Set the LogLevel property on the connection string to the desired level of information to include in log files.

Set the LogPath property on the connection string to the full path to the folder where you want to save log files.

For example, the following connection URL enables logging level 6 and saves the log files to the C:\temp folder:
jdbc:databricks://localhost:11000;LogLevel=6;LogPath=C:\\temp
Restart your JDBC application and reconnect to the server to apply the settings.
Other feature properties
The following properties enable features in the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
EnableComplexDatatypeSupport | 0 | If set to 1, support for complex data types (ARRAYs, STRUCTs, MAPs) as native Java objects instead of strings is enabled. |
EnableTelemetry | 0 | If set to 1, telemetry is enabled. See Telemetry. |
UserAgentEntry | browser | The User-Agent entry to be included in the HTTP request. This value is in the following format: [ProductName]/[ProductVersion] [Comment] |
UseThriftClient | 1 | Whether the JDBC driver should use the Thrift client or the Statement Execution APIs. |
VolumeOperationAllowedLocalPaths | (empty) | The comma-separated list of allowed local paths for downloading and uploading Unity Catalog volume ingestion files. The paths include subdirectories as well. See Manage files using volumes. |
VolumeOperationRetryableHttpCode | 408,502,503,504 | The comma-separated list of retryable HTTP codes for Unity Catalog volume ingestion. |
VolumeOperationRetryTimeout | 15 | The retry timeout in minutes for Unity Catalog volume ingestion HTTP requests. |
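The following sketch illustrates EnableComplexDatatypeSupport. The host, HTTP path, and query are placeholders, and the concrete Java classes returned for complex values are not specified in this article, so the example inspects them generically.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ComplexTypeExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical host and HTTP path; authentication properties are omitted
        // here and must be added as described in "Authenticate the driver".
        String url = "jdbc:databricks://<server-hostname>:443;httpPath=<http-path>"
                + ";EnableComplexDatatypeSupport=1";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT array(1, 2, 3) AS a, map('k', 'v') AS m")) {
            while (rs.next()) {
                // With EnableComplexDatatypeSupport=1, ARRAY/STRUCT/MAP columns are
                // returned as Java objects rather than strings. The concrete classes
                // are not listed in this article, so they are inspected generically.
                Object arrayValue = rs.getObject("a");
                Object mapValue = rs.getObject("m");
                System.out.println(arrayValue.getClass().getName() + ": " + arrayValue);
                System.out.println(mapValue.getClass().getName() + ": " + mapValue);
            }
        }
    }
}
```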
Telemetry collection
Telemetry enables Databricks to streamline debugging and provide timely troubleshooting by collecting:
- Client environment details (driver version, runtime, OS details)
- JDBC connection configurations (excludes any PII data)
- Operation latency measurements
- Execution result format (inline JSON, Arrow, etc.)
- Operation types (execution query, metadata query, volume operations)
- Error classification data
- Retry counts
Note
Databricks maintains strict privacy standards ensuring no collection of query content, results, or personally identifiable information (PII).