AKS application logs in Log Analytics Workspace

Souvik Saha Choudhary 21 Reputation points
2023-05-26T11:58:57.2633333+00:00

I am running an application in AKS. The application generates multiple log files.

I want all these log details in Log Analytics Workspace.

How can I implement it?

Azure Kubernetes Service (AKS)

Accepted answer
  1. vipullag-MSFT 25,861 Reputation points
    2023-05-30T08:45:07.7333333+00:00

    Hello Souvik Saha Choudhary

    To collect logs from your application running in AKS and send them to a Log Analytics workspace, you can use the Azure Monitor for containers solution.

    Azure Monitor for containers can collect logs from the container's stdout and stderr streams, as well as from application logs written to the file system. To collect logs from the file system, you can use a Fluentd or Logstash DaemonSet to tail the log files and send them to Azure Monitor for containers.

    Here are the high-level steps to configure Azure Monitor for containers to collect logs from your application:

    1. Deploy a Fluentd or Logstash DaemonSet to your AKS cluster. You can use a pre-built image or create your own image that includes the necessary plugins to tail the log files.
    2. Configure the Fluentd or Logstash DaemonSet to tail the log files and send them to Azure Monitor for containers. You can use the Azure Monitor for containers Fluentd or Logstash plugin to send the logs to Azure Monitor.
    3. Enable monitoring of your AKS cluster in the Azure portal from Azure Monitor.
    4. Verify that logs from your application are being collected in the Log Analytics workspace.
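    As an illustrative sketch only (the log path, tag, and workspace parameters below are assumptions, not details from this thread), a `fluent.conf` fragment for step 2 that tails the application's log files and forwards them with the `fluent-plugin-azure-loganalytics` output plugin might look like:

    ```conf
    # Tail application log files mounted into the Fluentd DaemonSet pod.
    <source>
      @type tail
      path /var/log/myapp/*.log            # placeholder path to the app's log files
      pos_file /var/log/fluentd-myapp.pos  # position file so restarts resume correctly
      tag myapp.logs
      <parse>
        @type none                         # forward each line as raw text
      </parse>
    </source>

    # Forward tailed lines to the Log Analytics workspace.
    <match myapp.**>
      @type azure-loganalytics             # plugin name is an assumption; verify against the plugin you deploy
      customer_id YOUR_WORKSPACE_ID        # placeholder workspace ID
      shared_key  YOUR_WORKSPACE_KEY       # placeholder workspace key
      log_type    MyAppLogs                # placeholder custom log type
    </match>
    ```

    The log files must be on a volume that is mounted into both the application pod and the Fluentd DaemonSet pod (for example, a hostPath volume) for the tail source to see them.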

    Hope that helps.


2 additional answers

  1. AirGordon 7,030 Reputation points
    2023-05-26T12:35:34.4466667+00:00

    Two options for dealing with log files:

    1. Use the Log Ingestion API, either from your code or from another process.
    2. Change the way your app logs so that it writes to stdout, which is automatically picked up by Container Insights in AKS.
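    As a minimal sketch of option 1 (the endpoint, data collection rule ID, and stream name below are placeholders, not values from this thread), the `azure-monitor-ingestion` Python SDK can push log lines to a custom Log Analytics table:

    ```python
    # Hedged sketch: sending application log lines to Log Analytics via the
    # Logs Ingestion API. Requires a data collection endpoint (DCE) and a
    # data collection rule (DCR) targeting a custom table.
    from datetime import datetime, timezone

    def build_entries(lines):
        """Shape raw log lines as records for a custom Log Analytics table."""
        return [
            {"TimeGenerated": datetime.now(timezone.utc).isoformat(),
             "RawData": line}
            for line in lines
        ]

    def upload_logs(lines):
        # Imported here so build_entries stays usable without the SDK installed.
        from azure.identity import DefaultAzureCredential
        from azure.monitor.ingestion import LogsIngestionClient

        client = LogsIngestionClient(
            endpoint="https://<your-dce>.ingest.monitor.azure.com",  # placeholder DCE
            credential=DefaultAzureCredential(),
        )
        client.upload(
            rule_id="<dcr-immutable-id>",      # placeholder DCR immutable ID
            stream_name="Custom-AppLogs_CL",   # placeholder stream/table name
            logs=build_entries(lines),
        )
    ```

    A sidecar or cron process could call `upload_logs` on new lines tailed from the application's log files.
    
    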

  2. Andrei Barbu 2,581 Reputation points Microsoft Employee
    2023-05-26T14:58:26.9033333+00:00

    Hello Souvik Saha Choudhary

    I am just adding a few more details on top of AirGordon's answer.

    I would recommend enabling the monitoring add-on (also known as Container Insights). You will need to provide a Log Analytics workspace (or use the default one), and the logs of the pods/containers in your AKS cluster will be stored there.
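    For reference, the add-on can also be enabled from the Azure CLI; the cluster, resource group, and workspace resource ID below are placeholders:

    ```shell
    # Enable the monitoring add-on (Container Insights) on an existing cluster.
    az aks enable-addons \
      --addons monitoring \
      --name myAKSCluster \
      --resource-group myResourceGroup \
      --workspace-resource-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
    ```

    Omitting `--workspace-resource-id` lets Azure pick or create a default workspace.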

    Then you can use and adapt the following query to get the logs per pod and namespace:

    let startTimestamp = ago(1h);
    KubePodInventory
    | where TimeGenerated > startTimestamp
    | project ContainerID, PodName=Name, Namespace
    | where PodName contains "name" and Namespace startswith "namespace"
    | distinct ContainerID, PodName
    | join
    (
        ContainerLog
        | where TimeGenerated > startTimestamp
    )
    on ContainerID
    // at this point before the next pipe, columns from both tables are available to be "projected". Due to both
    // tables having a "Name" column, we assign an alias as PodName to one column which we actually want
    | project TimeGenerated, PodName, LogEntry, LogEntrySource
    | summarize by TimeGenerated, LogEntry
    | order by TimeGenerated desc
    

    Reference link for the query: https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-log-query#pods-by-name-and-namespace

    Hopefully this is what you are looking for! If you have additional questions, please let us know in the comments.

    If this has been helpful, please take a moment to accept answers as this helps increase visibility of this question for other members of the Microsoft Q&A community. Thank you for helping to improve Microsoft Q&A!
