Snowflake (using Azure Functions) connector for Microsoft Sentinel
The Snowflake data connector provides the capability to ingest Snowflake login logs and query logs into Microsoft Sentinel using the Snowflake Python Connector. Refer to the Snowflake documentation for more information.
This is autogenerated content. For changes, contact the solution provider.
Connector attributes
| Connector attribute | Description |
|---|---|
| Log Analytics table(s) | Snowflake_CL |
| Data collection rules support | Not currently supported |
| Supported by | Microsoft Corporation |
Query samples
All Snowflake Events
Snowflake_CL
| sort by TimeGenerated desc
Prerequisites
To integrate with Snowflake (using Azure Functions) make sure you have:
- Microsoft.Web/sites permissions: Read and write permissions to Azure Functions are required to create a Function App. See the documentation to learn more about Azure Functions.
- Snowflake Credentials: the Snowflake Account Identifier, Snowflake User, and Snowflake Password are required for the connection. See the documentation to learn more about the Snowflake Account Identifier. Instructions for creating a user for this connector are provided below (a connection sketch also follows this list).
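For orientation, the following is a minimal sketch of how the Snowflake Python Connector can use these credentials to query login history. It is not the connector's actual implementation, and the account, user, password, warehouse, and role values are placeholders modeled on the examples in this article.

```python
# Minimal sketch only: connects to Snowflake with the snowflake-connector-python package
# and reads recent login events from the ACCOUNT_USAGE share. All identifiers below are
# placeholders; the real connector's code and query set may differ.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",      # Snowflake Account Identifier
    user="EXAMPLE_USER_NAME",            # user created in STEP 1 below
    password="example_password",
    warehouse="EXAMPLE_WAREHOUSE_NAME",
    role="EXAMPLE_ROLE_NAME",
)
try:
    cur = conn.cursor()
    # The IMPORTED PRIVILEGES grant in STEP 1 exposes the SNOWFLAKE.ACCOUNT_USAGE views,
    # including LOGIN_HISTORY and QUERY_HISTORY.
    cur.execute(
        "SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY "
        "WHERE EVENT_TIMESTAMP > DATEADD(hour, -1, CURRENT_TIMESTAMP())"
    )
    print(f"Fetched {len(cur.fetchall())} login events")
finally:
    conn.close()
```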
Vendor installation instructions
Note
This connector uses Azure Functions to connect to Snowflake to pull logs into Microsoft Sentinel. This might result in additional costs for data ingestion and for storing data in Azure Blob Storage. Check the Azure Functions pricing page and the Azure Blob Storage pricing page for details.
(Optional Step) Securely store workspace and API authorization key(s) or token(s) in Azure Key Vault. Azure Key Vault provides a secure mechanism to store and retrieve key values. Follow these instructions to use Azure Key Vault with an Azure Function App.
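If you use Azure Key Vault, the Function App can read the stored secrets at runtime. The following is a minimal sketch assuming the azure-identity and azure-keyvault-secrets packages, a managed identity with permission to get secrets, and hypothetical vault and secret names; it is not part of the official instructions.

```python
# Minimal sketch: read the workspace key and Snowflake password from Azure Key Vault.
# The vault URL and secret names are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-key-vault-name>.vault.azure.net"

# DefaultAzureCredential uses the Function App's managed identity when deployed.
credential = DefaultAzureCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credential)

workspace_key = client.get_secret("WorkspaceKey").value
snowflake_password = client.get_secret("SnowflakePassword").value
```

Alternatively, Key Vault secrets can be exposed to the Function App as application settings through Key Vault references, which avoids adding SDK code.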
Note
This data connector depends on a parser based on a Kusto Function, Snowflake, to work as expected. The parser is deployed with the Microsoft Sentinel solution.
STEP 1 - Creating a user in Snowflake
To query data from Snowflake, you need a user that is assigned to a role with sufficient privileges and a virtual warehouse cluster. The initial size of this cluster is set to small; if that is insufficient, the cluster size can be increased as necessary.
Enter the Snowflake console.
Switch role to SECURITYADMIN and create a new role:
USE ROLE SECURITYADMIN; CREATE OR REPLACE ROLE EXAMPLE_ROLE_NAME;
Switch role to SYSADMIN, create a warehouse, and grant access to it:
USE ROLE SYSADMIN; CREATE OR REPLACE WAREHOUSE EXAMPLE_WAREHOUSE_NAME WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 5 AUTO_RESUME = true INITIALLY_SUSPENDED = true; GRANT USAGE, OPERATE ON WAREHOUSE EXAMPLE_WAREHOUSE_NAME TO ROLE EXAMPLE_ROLE_NAME;
Switch role to SECURITYADMIN and create a new user:
USE ROLE SECURITYADMIN; CREATE OR REPLACE USER EXAMPLE_USER_NAME PASSWORD = 'example_password' DEFAULT_ROLE = EXAMPLE_ROLE_NAME DEFAULT_WAREHOUSE = EXAMPLE_WAREHOUSE_NAME;
Switch role to ACCOUNTADMIN and grant the role access to the SNOWFLAKE database:
USE ROLE ACCOUNTADMIN; GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE EXAMPLE_ROLE_NAME;
Switch role to SECURITYADMIN and assign role to user:
USE ROLE SECURITYADMIN; GRANT ROLE EXAMPLE_ROLE_NAME TO USER EXAMPLE_USER_NAME;
IMPORTANT: Save the user name and password created during this step, as they will be used during the deployment step.
STEP 2 - Choose ONE from the following two deployment options to deploy the connector and the associated Azure Function
IMPORTANT: Before deploying the data connector, have the Workspace ID and Workspace Primary Key (available from the Log Analytics workspace), as well as the Snowflake credentials, readily available.
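For context, the Workspace ID and Workspace Primary Key are typically used by this kind of Azure Function to sign requests to the Azure Monitor HTTP Data Collector API, which writes the collected events into the custom Snowflake_CL table. The sketch below illustrates that signing scheme under those assumptions; it is not the connector's actual code, and the ID, key, and payload are placeholders.

```python
# Minimal sketch of posting events to the Azure Monitor HTTP Data Collector API using
# a workspace ID and primary key. Placeholder values throughout; not the connector's code.
import base64
import datetime
import hashlib
import hmac
import json

import requests

WORKSPACE_ID = "<workspace-id>"
WORKSPACE_KEY = "<workspace-primary-key>"   # base64-encoded shared key
LOG_TYPE = "Snowflake"                      # Log Analytics appends _CL -> Snowflake_CL


def post_to_log_analytics(events):
    body = json.dumps(events)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    # Build the SharedKey signature over the request metadata.
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs"
    )
    signature = base64.b64encode(
        hmac.new(
            base64.b64decode(WORKSPACE_KEY),
            string_to_sign.encode("utf-8"),
            digestmod=hashlib.sha256,
        ).digest()
    ).decode()
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    return requests.post(url, data=body, headers=headers).status_code
```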
Next steps
For more information, go to the related solution in the Azure Marketplace.