Sentinel to Blob

Soumya Banerjee 126 Reputation points
2022-04-18T13:10:47.657+00:00

I want to transfer data from a Log Analytics workspace (Sentinel) to Azure Blob storage through the data export option. I have set up 30 days of retention in the Log Analytics workspace (LAW).

From the 31st day onward I want the data in Blob storage. Is there a way to send the data from LAW to Azure Blob through data export only after 30 days? (Retention for the first 30 days is free in LAW anyway.)

Data in LAW (30 days) ---> data export ---> Azure Blob (day 31 to 180) ---> automatically delete data from Blob that has crossed 180 days

I want the data stored in Blob to be deleted automatically once it has been there for 180 days.

In storage lifecycle management, I can see a base blobs option, "Last modified". If I select 30 days here, will it only move the data that has crossed 30 days in LAW and not the current data?

Similarly, to delete data that has crossed 180 days, should I select "if base blobs were last modified more than 180 days ago"?


4 answers

  1. Andrew Blumhardt 9,491 Reputation points Microsoft Employee
    2022-04-18T13:58:30.13+00:00

    My understanding is that data export sends the data simultaneously at ingestion; there is no option to export only after the retention period. In practice, whether the data starts flowing during or after retention makes little difference, at least in the 30-90 day range. You could be more targeted with a Logic App, but the added cost and complexity may not be worth the effort: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export
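
    For illustration, a data export rule is defined per table on the workspace. A minimal sketch of the REST call, per the data export doc linked above (all resource names and the table list are placeholders, not from the question):

    PUT https://management.azure.com/subscriptions/<sub-id>/resourcegroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>/dataExports/<rule-name>?api-version=2020-08-01

    {
      "properties": {
        "destination": {
          "resourceId": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        },
        "tableNames": [
          "SecurityEvent",
          "Syslog"
        ],
        "enable": true
      }
    }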

    Sentinel increased the free retention period to 90 days. Extended storage in Log Analytics is very cost effective in the short term. If you only require 180 days, I think you will find the additional 90 days within Log Analytics affordable and simple to manage. Extended retention in Log Analytics can become expensive for large datasets after 8-12 months; the Azure pricing calculator can help with extended-storage budgeting if needed.

    Also note that Log Analytics supports table-level retention settings, set with an ARM template (or the Tables REST API) if you have varying retention requirements for individual tables. This option can make native retention more affordable.
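
    As a sketch, the same setting can be applied with the Tables REST API (the workspace path and table name below are placeholders):

    PUT https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>/tables/SecurityEvent?api-version=2021-12-01-preview

    {
      "properties": {
        "retentionInDays": 30
      }
    }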

    You should also check out the new basic logs, archive option, and ingestion-time filtering. These new options will have a big impact on archival strategy. https://techcommunity.microsoft.com/t5/azure-observability-blog/the-next-evolution-of-azure-monitor-logs/ba-p/3143195


  2. Andrew Blumhardt 9,491 Reputation points Microsoft Employee
    2022-04-18T18:42:50.717+00:00

    I know very little about the blob storage tiers or lifecycle management options; maybe someone else can chime in there. A short test would verify the behavior.

    I would still recommend working with your client on a more manageable solution.

    This may help: https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview


  3. Sudipta Chakraborty - MSFT 1,096 Reputation points Microsoft Employee
    2022-04-18T19:01:23.697+00:00

    @Soumya Banerjee :

    The following sample blob lifecycle management rule filters the account so that the actions run on objects that live inside sample-container and whose names start with blob1.

    Actions:

    • Tier blob to cool tier 30 days after last modification
    • Tier blob to archive tier 90 days after last modification
    • Delete blob 2,555 days after last modification (seven years; in your case it would be 180 days)
    • Delete previous versions 90 days after creation

    {
      "rules": [
        {
          "enabled": true,
          "name": "sample-rule",
          "type": "Lifecycle",
          "definition": {
            "actions": {
              "version": {
                "delete": {
                  "daysAfterCreationGreaterThan": 90
                }
              },
              "baseBlob": {
                "tierToCool": {
                  "daysAfterModificationGreaterThan": 30
                },
                "tierToArchive": {
                  "daysAfterModificationGreaterThan": 90
                },
                "delete": {
                  "daysAfterModificationGreaterThan": 2555
                }
              }
            },
            "filters": {
              "blobTypes": [
                "blockBlob"
              ],
              "prefixMatch": [
                "sample-container/blob1"
              ]
            }
          }
        }
      ]
    }
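
    Adapted to the 180-day requirement in this question, a minimal sketch would keep only the delete action (the container name export-container is a placeholder, and the version action is dropped on the assumption that blob versioning is not enabled):

    {
      "rules": [
        {
          "enabled": true,
          "name": "delete-after-180-days",
          "type": "Lifecycle",
          "definition": {
            "actions": {
              "baseBlob": {
                "delete": {
                  "daysAfterModificationGreaterThan": 180
                }
              }
            },
            "filters": {
              "blobTypes": [
                "blockBlob"
              ],
              "prefixMatch": [
                "export-container/"
              ]
            }
          }
        }
      ]
    }

    Exported blobs generally stop being modified shortly after they are written, so the last-modified clock effectively starts when the data lands in the container.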

  4. Clive Watson 5,711 Reputation points MVP
    2022-04-19T06:31:03.83+00:00

    What will you do with the data in Blob? Sentinel is moving to the (preview) archive feature for 90-day to 7-year retention (the data automatically ages off once the period is reached).
    This feature allows full searching (using KQL) and per-table restore, as Andrew mentioned (https://learn.microsoft.com/en-gb/azure/azure-monitor/logs/cost-logs#log-data-retention-and-archive). If you plan to use the data again, then IMO this is the better option; if you never plan to use the data, then Blob may be an option. You set archive per table today and that's it (via the REST API, https://learn.microsoft.com/en-gb/azure/azure-monitor/logs/data-retention-archive?tabs=api-1%2Capi-2#set-retention-and-archive-policy-by-table, or via a Sentinel workbook on GitHub).
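
    As a sketch of that REST call, keeping 90 days of interactive retention with a 180-day total (the remainder sitting in archive); the workspace path and table name are placeholders:

    PUT https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>/tables/SecurityEvent?api-version=2021-12-01-preview

    {
      "properties": {
        "retentionInDays": 90,
        "totalRetentionInDays": 180
      }
    }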

    You need to compare the cost of Blob versus archive for the way you plan to use the data (you don't say, so I don't know what you will do with the data in Blob).

    Over the years I've seen many people move data to Blob with no plan for how to use it again; then they run into problems when they try, usually when they need to do a quick check!