Azure Storage Analytics logging

Storage Analytics logs detailed information about successful and failed requests to a storage service. This information can be used to monitor individual requests and to diagnose issues with a storage service. Requests are logged on a best-effort basis. This means that most requests will result in a log record, but the completeness and timeliness of Storage Analytics logs are not guaranteed.

Note

We recommend that you use Azure Storage logs in Azure Monitor instead of Storage Analytics logs. To learn more, see any of the following articles:

  • Monitoring Azure Blob Storage
  • Monitoring Azure Files
  • Monitoring Azure Queue Storage
  • Monitoring Azure Table storage

Storage Analytics logging is not enabled by default for your storage account. You can enable it in the Azure portal, or by using PowerShell or the Azure CLI. For step-by-step guidance, see Enable and manage Azure Storage Analytics logs (classic).

You can also enable Storage Analytics logs programmatically via the REST API or the client library. Use the Set Blob Service Properties, Set Queue Service Properties, and Set Table Service Properties operations to enable Storage Analytics for each service. To see an example that enables Storage Analytics logs by using .NET, see Enable logs.
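For reference, the request body that such a call sends follows the Storage Analytics properties schema. The fragment below is a sketch of a Set Blob Service Properties payload that turns on read, write, and delete logging with a seven-day retention policy; adjust the logging flags and retention values to your own needs:

```xml
<?xml version="1.0" encoding="utf-8"?>
<StorageServiceProperties>
  <Logging>
    <!-- Version of Storage Analytics to configure -->
    <Version>1.0</Version>
    <!-- Which request types to log -->
    <Delete>true</Delete>
    <Read>true</Read>
    <Write>true</Write>
    <!-- How long to keep the log data -->
    <RetentionPolicy>
      <Enabled>true</Enabled>
      <Days>7</Days>
    </RetentionPolicy>
  </Logging>
</StorageServiceProperties>
```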

Log entries are created only if there are requests made against the service endpoint. For example, if a storage account has activity in its Blob endpoint but not in its Table or Queue endpoints, only logs pertaining to the Blob service will be created.

Note

Storage Analytics logging is currently available only for the Blob, Queue, and Table services. Storage Analytics logging is also available for premium-performance BlockBlobStorage accounts. However, it isn't available for general-purpose v2 accounts with premium performance.

Requests that are logged

Logging authenticated requests

The following types of authenticated requests are logged:

  • Successful requests

  • Failed requests, including timeout, throttling, network, authorization, and other errors

  • Requests that use a shared access signature (SAS) or OAuth, including failed and successful requests

  • Requests to analytics data

Requests made by Storage Analytics itself, such as log creation or deletion, are not logged.

Logging anonymous requests

The following types of anonymous requests are logged:

  • Successful requests

  • Server errors

  • Timeout errors for both client and server

  • Failed GET requests with error code 304 (Not Modified)

All other failed anonymous requests are not logged. For a full list of the logged data, see the Storage Analytics Logged Operations and Status Messages and Storage Analytics Log Format topics.

Note

Storage Analytics logs all internal calls to the data plane. Calls from the Azure Storage Resource Provider are also logged. To identify these requests, look for the query string <sk=system-1> in the request URL.

How logs are stored

All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs. This container cannot be deleted after Storage Analytics has been enabled, though its contents can be deleted. If you use a storage-browsing tool to navigate to the container directly, you will see all the blobs that contain your logging data.

Note

The $logs container is not displayed when a container listing operation is performed, such as the List Containers operation. It must be accessed directly. For example, you can use the List Blobs operation to access the blobs in the $logs container.

As requests are logged, Storage Analytics uploads intermediate results as blocks. Periodically, Storage Analytics commits these blocks and makes them available as a blob. It can take up to an hour for log data to appear in the blobs in the $logs container because of the frequency at which the storage service flushes the log writers. Duplicate records may exist for logs created in the same hour. You can determine whether a record is a duplicate by checking the RequestId and operation number.
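The duplicate check described above can be sketched in plain Python. This is a hypothetical illustration: it assumes the log records have already been parsed into dictionaries with "request-id" and "operation-count" keys (names chosen here for the example, not a fixed schema):

```python
def dedupe_records(records):
    """Keep only the first occurrence of each (request id, operation number) pair.

    `records` is a list of dicts, each representing one parsed log entry.
    Two entries with the same request id AND operation number are duplicates
    of the same logged operation.
    """
    seen = set()
    unique = []
    for rec in records:
        key = (rec["request-id"], rec["operation-count"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

The pair is used as the key (rather than the request id alone) because a single request can emit several logged operations, each with its own operation number.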

If you have a high volume of log data, with multiple files for each hour, you can use the blob metadata fields to determine what data a log contains. This is also useful because there can sometimes be a delay while data is written to the log files: the blob metadata gives a more accurate indication of the blob's content than the blob name.

Most storage browsing tools enable you to view the metadata of blobs; you can also read this information using PowerShell or programmatically. The following PowerShell snippet is an example of filtering the list of log blobs by name to specify a time, and by metadata to identify just those logs that contain write operations.

Get-AzStorageBlob -Container '$logs' |
    Where-Object { $_.Name -match 'blob/2014/05/21/05' -and $_.ICloudBlob.Metadata.LogType -match 'write' } |
    ForEach-Object { "{0} {1} {2} {3}" -f $_.Name, $_.ICloudBlob.Metadata.StartTime, $_.ICloudBlob.Metadata.EndTime, $_.ICloudBlob.Metadata.LogType }

For information about listing blobs programmatically, see Enumerating Blob Resources and Setting and Retrieving Properties and Metadata for Blob Resources.

Log naming conventions

Each log will be written in the following format:

<service-name>/YYYY/MM/DD/hhmm/<counter>.log

The following table describes each attribute in the log name:

Attribute | Description
<service-name> | The name of the storage service. For example: blob, table, or queue.
YYYY | The four-digit year for the log. For example: 2011.
MM | The two-digit month for the log. For example: 07.
DD | The two-digit day for the log. For example: 31.
hh | The two-digit hour that indicates the starting hour for the logs, in 24-hour UTC format. For example: 18.
mm | The two-digit number that indicates the starting minute for the logs. Note: this value is unsupported in the current version of Storage Analytics, and will always be 00.
<counter> | A zero-based, six-digit counter that indicates the number of log blobs generated for the storage service in an hour time period. The counter starts at 000000. For example: 000001.

The following is a complete sample log name that combines the above examples:

blob/2011/07/31/1800/000001.log

The following is a sample URI that can be used to access the above log:

https://<accountname>.blob.core.windows.net/$logs/blob/2011/07/31/1800/000001.log

When a storage request is logged, the resulting log name correlates to the hour when the requested operation completed. For example, if a GetBlob request completed at 6:30 PM UTC on 7/31/2011, the log would be written with the following prefix: blob/2011/07/31/1800/
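Deriving that prefix from a request's completion time can be sketched in plain Python (no Azure SDK required; `completed_at` is assumed to be a timezone-aware datetime):

```python
from datetime import datetime, timezone

def log_prefix(service, completed_at):
    """Build the $logs blob-name prefix for the hour in which a request completed.

    The minute component is always '00' in the current version of
    Storage Analytics, so only the hour is taken from the timestamp.
    """
    t = completed_at.astimezone(timezone.utc)  # log hours are in UTC
    return f"{service}/{t:%Y/%m/%d/%H}00/"

# A GetBlob request that completed at 6:30 PM UTC on 7/31/2011:
prefix = log_prefix("blob", datetime(2011, 7, 31, 18, 30, tzinfo=timezone.utc))
# -> "blob/2011/07/31/1800/"
```

Listing blobs under this prefix in the $logs container returns every log blob for that service and hour, across all values of the counter.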

Log metadata

All log blobs are stored with metadata that can be used to identify what logging data the blob contains. The following table describes each metadata attribute:

Attribute | Description
LogType | Describes whether the log contains information pertaining to read, write, or delete operations. This value can include one type or a combination of all three, separated by commas. Example 1: write. Example 2: read,write. Example 3: read,write,delete.
StartTime | The earliest time of an entry in the log, in the form YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:21:46Z.
EndTime | The latest time of an entry in the log, in the form YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:22:09Z.
LogVersion | The version of the log format. For example: 1.0.

The following list displays complete sample metadata using the above examples:

  • LogType=write
  • StartTime=2011-07-31T18:21:46Z
  • EndTime=2011-07-31T18:22:09Z
  • LogVersion=1.0
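These metadata fields can drive filtering programmatically as well as in a browsing tool. The following is a minimal Python sketch that selects log blobs containing write operations whose time range overlaps a given window; it assumes the metadata has already been fetched into plain dictionaries (for example, via a storage client), so only the standard library is used here:

```python
from datetime import datetime

def logs_with_writes(blobs, window_start, window_end):
    """Return the names of log blobs that contain write operations and
    whose [StartTime, EndTime] range overlaps the given window.

    `blobs` is assumed to be a list of (name, metadata_dict) pairs, where
    the metadata dict carries the LogType/StartTime/EndTime attributes
    described in the table above.
    """
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    selected = []
    for name, meta in blobs:
        # LogType is a comma-separated combination of read/write/delete.
        if "write" not in meta["LogType"].split(","):
            continue
        start = datetime.strptime(meta["StartTime"], fmt)
        end = datetime.strptime(meta["EndTime"], fmt)
        # Two intervals overlap when each starts before the other ends.
        if start <= window_end and end >= window_start:
            selected.append(name)
    return selected
```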

Log entries

The following sections show an example log entry for each supported Azure Storage service.

Example log entry for Blob Storage

2.0;2022-01-03T20:34:54.4617505Z;PutBlob;SASSuccess;201;7;7;sas;;logsamples;blob;https://logsamples.blob.core.windows.net/container1/1.txt?se=2022-02-02T20:34:54Z&sig=XXXXX&sp=rwl&sr=c&sv=2020-04-08&timeout=901;"/logsamples/container1/1.txt";xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx;0;71.197.193.44:53371;2019-12-12;654;13;337;0;13;"xxxxxxxxxxxxxxxxxxxxx==";"xxxxxxxxxxxxxxxxxxxxx==";""0x8D9CEF88004E296"";Monday, 03-Jan-22 20:34:54 GMT;;"Microsoft Azure Storage Explorer, 1.20.1, win32, azcopy-node, 2.0.0, win32, AzCopy/10.11.0 Azure-Storage/0.13 (go1.15; Windows_NT)";;"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx";;;;;;;;

Example log entry for Blob Storage (Data Lake Storage Gen2 enabled)

2.0;2022-01-04T22:50:56.0000775Z;RenamePathFile;Success;201;49;49;authenticated;logsamples;logsamples;blob;"https://logsamples.dfs.core.windows.net/my-container/myfileorig.png?mode=legacy";"/logsamples/my-container/myfilerenamed.png";xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx;0;73.157.16.8;2020-04-08;591;0;224;0;0;;;;Friday, 11-Jun-21 17:58:15 GMT;;"Microsoft Azure Storage Explorer, 1.19.1, win32 azsdk-js-storagedatalake/12.3.1 (NODE-VERSION v12.16.3; Windows_NT 10.0.22000)";;"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx";;;;;;;;

Example log entry for Queue Storage

2.0;2022-01-03T20:35:04.6097590Z;PeekMessages;Success;200;5;5;authenticated;logsamples;logsamples;queue;https://logsamples.queue.core.windows.net/queue1/messages?numofmessages=32&peekonly=true&timeout=30;"/logsamples/queue1";xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx;0;71.197.193.44:53385;2020-04-08;536;0;232;62;0;;;;;;"Microsoft Azure Storage Explorer, 1.20.1, win32 azsdk-js-storagequeue/12.3.1 (NODE-VERSION v12.16.3; Windows_NT 10.0.22000)";;"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx";;;;;;;;

Example log entry for Table Storage

1.0;2022-01-03T20:35:13.0719766Z;CreateTable;Success;204;30;30;authenticated;logsamples;logsamples;table;https://logsamples.table.core.windows.net/Tables;"/logsamples/Table1";xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx;0;71.197.193.44:53389;2018-03-28;601;22;339;0;22;;;;;;"Microsoft Azure Storage Explorer, 1.20.1, win32, Azure-Storage/2.10.3 (NODE-VERSION v12.16.3; Windows_NT 10.0.22000)";;"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx"
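Each entry above is a semicolon-delimited record in which fields that may themselves contain semicolons (URLs, user-agent strings, ETags) are quoted. A naive split on ";" therefore breaks on such fields; Python's csv module handles the quoting correctly. A small sketch, shown with a shortened synthetic entry rather than a full log line:

```python
import csv
import io

def parse_log_line(line):
    """Split one Storage Analytics log entry into its fields.

    Fields are delimited by semicolons; quoted fields may contain
    embedded semicolons, which csv.reader preserves intact.
    """
    return next(csv.reader(io.StringIO(line), delimiter=";", quotechar='"'))

# Shortened, synthetic entry: version, end time, operation, status,
# HTTP status code, and one quoted field with an embedded semicolon.
entry = '1.0;2022-01-03T20:35:13.0719766Z;CreateTable;Success;204;"a;b"'
fields = parse_log_line(entry)
# fields[0] is the log version, fields[2] the operation type, and the
# quoted field keeps its embedded semicolon intact.
```

For the authoritative position of every field in a real entry, see the Storage Analytics Log Format topic linked under Next steps.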

Next steps

  • Enable and manage Azure Storage Analytics logs (classic)
  • Storage Analytics Log Format
  • Storage Analytics Logged Operations and Status Messages
  • Storage Analytics Metrics (classic)

