Azure SQL - Auditing

Zippo706 6 Reputation points
2021-05-02T16:28:00.073+00:00

Adding auditing is easy in Azure SQL (non-managed); however, the field list is huge and most of it is useless from a security perspective. Is there a way I can audit the database, but only with certain fields, i.e. database name, UPN, IP, client name, statement, duration, login success, login failure? Other things like SID, hash, etc. are not required.

Is there a way to do this?
Will this screw up LogAnalytics/Sentinel?

Azure SQL Database

2 answers

  1. Alberto Morillo 32,886 Reputation points MVP
    2021-05-02T18:01:24.233+00:00

    If you wish to audit successful/unsuccessful login connections: once you enable auditing, you can open View Audit Logs and search for successful and unsuccessful login events. It should provide useful information like hostname and IP address. See the Azure SQL auditing documentation for more details.
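
    If you prefer to read the raw audit files instead of the portal blade, something along these lines works as a minimal sketch, assuming the SqlServer PowerShell module is installed and the audit logs land in the default sqldbauditlogs container (the storage URL, server, database and credentials below are placeholders):

    # Sketch only: read the audit .xel blobs back and keep just the authentication-related columns.
    $query = "
        SELECT event_time, server_principal_name, database_name, client_ip,
               application_name, succeeded
        FROM sys.fn_get_audit_file(
            'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/<servername>/<dbname>/',
            default, default)
        ORDER BY event_time DESC;"
    Invoke-Sqlcmd -ServerInstance "<servername>.database.windows.net" -Database "<dbname>" `
        -Username "<adminlogin>" -Password "<password>" -Query $query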

    You can also audit successful/unsuccessful authentications and DML T-SQL statements for a specific database using PowerShell, as shown below:

     Set-AzureRmSqlDatabaseAuditing -State Enabled -ResourceGroupName "resourceGroup" -ServerName "sqlservername" -DatabaseName "AdventureWorksLT" -StorageAccountName "storageAccount" -AuditActionGroup "SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP", "FAILED_DATABASE_AUTHENTICATION_GROUP", "BATCH_COMPLETED_GROUP" -AuditAction "UPDATE ON database::[AdventureWorksLT] BY [public]" -RetentionInDays 60
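
    Note that the AzureRm cmdlets have since been replaced by the Az module; a rough, untested equivalent of the example above with Set-AzSqlDatabaseAudit would look like this (the Az cmdlet takes the storage account resource ID rather than its name, and all names are placeholders):

    # Sketch using the newer Az.Sql module (Install-Module Az.Sql).
    Set-AzSqlDatabaseAudit `
        -ResourceGroupName "resourceGroup" `
        -ServerName "sqlservername" `
        -DatabaseName "AdventureWorksLT" `
        -BlobStorageTargetState Enabled `
        -StorageAccountResourceId "/subscriptions/<subId>/resourceGroups/resourceGroup/providers/Microsoft.Storage/storageAccounts/<storageAccount>" `
        -AuditActionGroup "SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP", "FAILED_DATABASE_AUTHENTICATION_GROUP", "BATCH_COMPLETED_GROUP" `
        -RetentionInDays 60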
    

    Another example, this time showing how to find out who dropped database objects in a specific database and schema:

    # -ServerName is the short server name, i.e. "sqlinstancename" for sqlinstancename.database.windows.net
    Set-AzureRmSqlDatabaseAuditing `
        -State Enabled `
        -ResourceGroupName "resourcegroupname" `
        -ServerName "sqlinstancename" `
        -StorageAccountName "storageaccountname" `
        -DatabaseName "dbname" `
        -AuditActionGroup 'SCHEMA_OBJECT_CHANGE_GROUP' `
        -RetentionInDays 8 `
        -AuditAction "DELETE ON schema::dbo BY [public]"
    

    Azure SQL Auditing should provide you the application name, duration, login name and client IP address of whatever executed the query if you look for the event type BATCH_COMPLETED once you have enabled Auditing. Please see the examples above.
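
    For reference, a sketch of pulling back just the columns you listed from those records, run the same way as the fn_get_audit_file snippet earlier (the host_name and duration_milliseconds columns depend on the audit log schema version, so treat them as optional):

    # Project only the security-relevant columns for records that carry a statement
    # (the BATCH_COMPLETED-type events). Storage URL, server and database are placeholders.
    $query = "
        SELECT event_time, database_name, server_principal_name, client_ip,
               application_name, host_name, statement, duration_milliseconds
        FROM sys.fn_get_audit_file(
            'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/<servername>/<dbname>/',
            default, default)
        WHERE statement <> ''
        ORDER BY event_time DESC;"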

    The Auditing feature won't screw up Log Analytics, Azure Monitor or the Intelligent Performance features.
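
    If the eventual goal is to consume the audit trail from Log Analytics/Sentinel, the newer Az cmdlet can also point the database audit at a Log Analytics workspace or an Event Hub directly instead of (or in addition to) blob storage. A hedged sketch, with parameter names taken from Set-AzSqlDatabaseAudit and all resource IDs as placeholders:

    # Sketch only: send this database's audit records to a Log Analytics workspace
    # (which is what Sentinel reads) and to an Event Hub. Names and IDs are placeholders.
    Set-AzSqlDatabaseAudit `
        -ResourceGroupName "resourcegroupname" `
        -ServerName "sqlservername" `
        -DatabaseName "dbname" `
        -AuditActionGroup "SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP", "FAILED_DATABASE_AUTHENTICATION_GROUP", "BATCH_COMPLETED_GROUP" `
        -LogAnalyticsTargetState Enabled `
        -WorkspaceResourceId "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>" `
        -EventHubTargetState Enabled `
        -EventHubAuthorizationRuleResourceId "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.EventHub/namespaces/<namespace>/authorizationRules/RootManageSharedAccessKey"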


  2. Zippo706 6 Reputation points
    2021-05-02T20:35:53.367+00:00

    Thank you for the response. The goal here is to audit the database, but will it act secondary to the server audit? I.e. the full server audit goes to blobs, but for database DB1 I want to send the audit to Log Analytics (and Sentinel) or to Splunk via Event Hub.

    My use case is:
    Someone in headquarters or over VPN used the production web-to-SQL account to log in to the database. I would know that because the IP address would be something like 55.55.55.55, or the client hostname would not match webserver1, webserver2, etc. I don't want to audit a select or update statement from webserver1, webserver2, etc.
    2nd use case:
    Production support has the ability to update the DB. They would use their domain\user account w/ MFA. However, I want to catch the statement that was made by them (i.e. delete from user where username='userx' or update user where username='userx'). There is change control, but developers/support don't always adhere to it. Their IP would always be 55.55.55.55.

    FYI, logging everything to blob isn't very helpful as it's in .xel format, and I only use that as a full fallback. Querying the blobs via fn_get_audit_file takes 15 min per query.

    Again, I know I can capture this with full server logging, but our logs are 100 GB/day and have several (sometimes 30 or more) minutes of latency in both Log Analytics and Splunk, as they have to index not only every request but also logging fields they really don't need (SID, hash, etc.).

    Would this be a better use case for events?

    Really just looking for some direction here.