CREATE DATABASE SCOPED CREDENTIAL (Transact-SQL)
Creates a database scoped credential. A database scoped credential is not mapped to a server login or database user. The database uses the credential to access the external location whenever it performs an operation that requires such access.
```sql
CREATE DATABASE SCOPED CREDENTIAL credential_name
WITH IDENTITY = 'identity_name'
    [ , SECRET = 'secret' ]
```
To view Transact-SQL syntax for SQL Server 2014 and earlier, see Previous versions documentation.
Specifies the name of the database scoped credential being created. credential_name cannot start with the number (#) sign. System credentials start with ##.
Specifies the name of the account to be used when connecting outside the server.
- To import a file from Azure Blob Storage or Azure Data Lake Storage using a shared key, the identity name must be SHARED ACCESS SIGNATURE. For more information about shared access signatures, see Using Shared Access Signatures (SAS).
- To import a file from Azure Blob Storage using a managed identity, the identity name must be Managed Identity.
- To load data into Azure Synapse Analytics, any valid value can be used for identity.
- When using Kerberos (Windows Active Directory or MIT KDC), do not include the domain name in the IDENTITY argument; use only the account name.
- When on SQL Server, if creating a database scoped credential with a Storage Access Key used as the SECRET, IDENTITY is ignored.
The only PolyBase external data source that supports Kerberos authentication is Hadoop. All other external data sources (SQL Server, Oracle, Teradata, MongoDB, generic ODBC) only support Basic Authentication.
WITH IDENTITY is not required if the container in Azure Blob storage is enabled for anonymous access. For an example querying Azure Blob storage, see Importing into a table from a file stored on Azure Blob storage.
Specifies the secret required for outgoing authentication.
SECRET is required to import a file from Azure Blob Storage. To load from Azure Blob Storage into Azure Synapse Analytics or Parallel Data Warehouse, the SECRET must be the Azure storage account key.
The SAS key value might begin with a '?' (question mark). When you use the SAS key as the SECRET, you must remove the leading '?'; otherwise, authentication fails.
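For illustration, a credential using a hypothetical SAS token (the token value shown here is a placeholder) should omit the leading '?':

```sql
-- Incorrect: SECRET = '?sv=2022-11-02&ss=b&srt=co&sp=rl&sig=<signature>'
-- Correct: strip the leading '?' from the SAS token.
CREATE DATABASE SCOPED CREDENTIAL SasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=2022-11-02&ss=b&srt=co&sp=rl&sig=<signature>';
```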
A database scoped credential is a record that contains the authentication information that is required to connect to a resource outside SQL Server. Most credentials include a Windows user and password.
Before creating a database scoped credential, the database must have a master key to protect the credential. For more information, see CREATE MASTER KEY (Transact-SQL).
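To check whether the current database already has a master key, you can query sys.symmetric_keys, where SQL Server surfaces the database master key under the name ##MS_DatabaseMasterKey##:

```sql
-- Returns one row if the database already has a master key.
SELECT name, create_date
FROM sys.symmetric_keys
WHERE name = '##MS_DatabaseMasterKey##';
```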
When IDENTITY is a Windows user, the secret can be the password. The secret is encrypted using the service master key. If the service master key is regenerated, the secret is re-encrypted using the new service master key.
When granting permissions for a shared access signature (SAS) for use with a PolyBase external table, select both Container and Object as allowed resource types. If not granted, you may receive error 16535 or 16561 when attempting to access the external table.
Information about database scoped credentials is visible in the sys.database_scoped_credentials catalog view.
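For example, the following query lists the credentials in the current database; the secret itself is never exposed through this view:

```sql
-- List database scoped credentials in the current database.
SELECT credential_id, name, credential_identity, create_date, modify_date
FROM sys.database_scoped_credentials;
```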
Here are some applications of database scoped credentials:
- SQL Server uses a database scoped credential to access non-public Azure Blob Storage or Kerberos-secured Hadoop clusters with PolyBase. To learn more, see CREATE EXTERNAL DATA SOURCE (Transact-SQL).
- Azure Synapse Analytics uses a database scoped credential to access non-public Azure Blob Storage with PolyBase. To learn more, see CREATE EXTERNAL DATA SOURCE (Transact-SQL). For more information about Azure Synapse storage authentication, see Use external tables with Synapse SQL.
- SQL Database uses database scoped credentials for its global query feature, the ability to query across multiple database shards.
- SQL Database uses database scoped credentials to write extended event files to Azure Blob Storage.
- SQL Database uses database scoped credentials for elastic pools. For more information, see Tame explosive growth with elastic databases.
- BULK INSERT and OPENROWSET use database scoped credentials to access data from Azure Blob Storage. For more information, see Examples of Bulk Access to Data in Azure Blob Storage.
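As a sketch of the BULK INSERT scenario (the data source name, storage URL, table, and file path below are hypothetical), the credential is attached to an external data source of TYPE = BLOB_STORAGE, which BULK INSERT then references:

```sql
-- Assumes a database scoped credential named AzureBlobSas already exists.
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer',
    CREDENTIAL = AzureBlobSas
);

-- Bulk load a CSV file from the container into an existing table.
BULK INSERT dbo.Sales
FROM 'data/sales.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);
```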
Requires CONTROL permission on the database.
SQL Server 2022
Starting with SQL Server 2022 (16.x), a new type of connector was introduced that uses REST API calls in place of the HADOOP connector. For Azure Blob Storage and Azure Data Lake Storage Gen2, the only supported authentication method is SHARED ACCESS SIGNATURE. For more information, see CREATE EXTERNAL DATA SOURCE (Transact-SQL).
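As a minimal sketch of the SQL Server 2022 REST-based connector (the credential name, container, storage account, and SAS token below are hypothetical), the abs:// prefix identifies Azure Blob Storage:

```sql
-- Credential for the REST-based connector; the SAS token is a placeholder.
CREATE DATABASE SCOPED CREDENTIAL AbsCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=2022-11-02&ss=b&srt=co&sp=rl&sig=<signature>';

-- 'abs://' is the SQL Server 2022 location prefix for Azure Blob Storage.
CREATE EXTERNAL DATA SOURCE AbsDataSource
WITH (
    LOCATION = 'abs://mycontainer@myaccount.blob.core.windows.net',
    CREDENTIAL = AbsCredential
);
```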
A. Creating a database scoped credential for your application
The following example creates the database scoped credential called AppCred. The database scoped credential contains the Windows user Mary5 and a password.
```sql
-- Create a db master key if one does not already exist, using your own password.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<EnterStrongPasswordHere>';

-- Create a database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL AppCred
WITH IDENTITY = 'Mary5',
     SECRET = '<EnterStrongPasswordHere>';
```
B. Creating a database scoped credential for a shared access signature
The following example creates a database scoped credential that can be used to create an external data source, which can do bulk operations, such as BULK INSERT and OPENROWSET. Shared Access Signatures cannot be used with PolyBase in SQL Server, APS or Azure Synapse Analytics.
```sql
-- Create a db master key if one does not already exist, using your own password.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<EnterStrongPasswordHere>';

-- Create a database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL MyCredentials
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'QLYMgmSXMklt%2FI1U6DcVrQixnlU5Sgbtk1qDRakUBGs%3D';
```
C. Creating a database scoped credential for PolyBase Connectivity to Azure Data Lake Store
The following example creates a database scoped credential that can be used to create an external data source, which can be used by PolyBase in Azure Synapse Analytics.
Azure Data Lake Store uses an Azure Active Directory application for service-to-service authentication. Create the Azure AD application and record your client_id, OAuth_2.0_Token_EndPoint, and Key before you try to create a database scoped credential.
```sql
-- Create a db master key if one does not already exist, using your own password.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<EnterStrongPasswordHere>';

-- Create a database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL ADL_User
WITH IDENTITY = '<client_id>@<OAuth_2.0_Token_EndPoint>',
     SECRET = '<key>';
```