title | description | ms.topic | ms.date | ms.custom |
---|---|---|---|---|
Continuous export of telemetry from Application Insights \| Microsoft Docs | Export diagnostic and usage data to storage in Microsoft Azure, and download it from there. | conceptual | 02/19/2021 | references_regions |
Want to keep your telemetry for longer than the standard retention period? Or process it in some specialized way? Continuous Export is ideal for this. The events you see in the Application Insights portal can be exported to storage in Microsoft Azure in JSON format. From there, you can download your data and write whatever code you need to process it.
Important
Continuous export has been deprecated. Migrate to a workspace-based Application Insights resource to use diagnostic settings for exporting telemetry.
Note
Continuous export is only supported for classic Application Insights resources. Workspace-based Application Insights resources must use diagnostic settings.
Before you set up continuous export, there are some alternatives you might want to consider:
- The Export button at the top of a metrics or search tab lets you transfer tables and charts to an Excel spreadsheet.
- Analytics provides a powerful query language for telemetry. It can also export results.
- If you're looking to explore your data in Power BI, you can do that without using Continuous Export.
- The Data access REST API lets you access your telemetry programmatically.
- You can also set up continuous export via PowerShell.
After Continuous Export copies your data to storage (where it can stay for as long as you like), it's still available in Application Insights for the usual retention period.
Continuous Export is supported in the following regions:
- Southeast Asia
- Canada Central
- Central India
- North Europe
- UK South
- Australia East
- Japan East
- Korea Central
- France Central
- East Asia
- West US
- Central US
- East US 2
- South Central US
- West US 2
- South Africa North
- North Central US
- Brazil South
- Switzerland North
- Australia Southeast
- UK West
- Germany West Central
- Switzerland West
- Australia Central 2
- UAE Central
- Brazil Southeast
- Australia Central
- UAE North
- Norway East
- Japan West
Note
Continuous Export will continue to work for applications in East US and West Europe if the export was configured before February 23, 2021. New Continuous Export rules cannot be configured on any application in East US or West Europe, regardless of when the application was created.
Continuous Export does not support the following Azure storage features/configurations:
- Use of VNET/Azure Storage firewalls in conjunction with Azure Blob storage.
Note
An application cannot export more than 3 TB of data per day. If more than 3 TB per day is exported, the export will be disabled. To export without a limit, use diagnostic settings-based export.
- In the Application Insights resource for your app, under Configure on the left, open Continuous Export and choose Add.
- Choose the telemetry data types you want to export.
- Create or select an Azure storage account where you want to store the data. For more information on storage pricing options, visit the official pricing page. Click Add, Export Destination, Storage account, and then either create a new store or choose an existing store.
[!Warning] By default, the storage location will be set to the same geographical region as your Application Insights resource. If you store in a different region, you may incur transfer charges.
- Create or select a container in the storage account.
Note
Once you've created your export, newly ingested data will begin to flow to Azure Blob storage. Continuous export will only transmit new telemetry that is created/ingested after continuous export was enabled. Any data that existed prior to enabling continuous export will not be exported, and there is no supported way to retroactively export previously created data using continuous export.
There can be a delay of about an hour before data appears in the storage.
Once the first export is complete, you'll find a structure similar to the following in your Azure Blob storage container. (This will vary depending on the data you're collecting.)
Name | Description |
---|---|
Availability | Reports availability web tests. |
Event | Custom events generated by TrackEvent(). |
Exceptions | Reports exceptions in the server and in the browser. |
Messages | Sent by TrackTrace, and by the logging adapters. |
Metrics | Generated by metric API calls. |
PerformanceCounters | Performance Counters collected by Application Insights. |
Requests | Sent by TrackRequest. The standard modules use this to report server response time, measured at the server. |
Click Continuous Export and select the storage account to edit.
To stop the export, click Disable. When you click Enable again, the export will restart with new data. You won't get the data that arrived in the portal while export was disabled.
To stop the export permanently, delete it. Doing so doesn't delete your data from storage.
- To add or change exports, you need Owner, Contributor, or Application Insights Contributor access rights. Learn about roles.
The exported data is the raw telemetry we receive from your application, except that we add location data, which we calculate from the client IP address.
Data that has been discarded by sampling is not included in the exported data.
Other calculated metrics are not included. For example, we don't export average CPU utilization, but we do export the raw telemetry from which the average is computed.
The data also includes the results of any availability web tests that you have set up.
Note
Sampling. If your application sends a lot of data, the sampling feature may operate and send only a fraction of the generated telemetry. Learn more about sampling.
You can inspect the storage directly in the portal. Select Home in the leftmost menu. At the top, where it says Azure services, select Storage accounts. Select the storage account name, select Blobs under Services on the overview page, and finally select the container name.
To inspect Azure storage in Visual Studio, open View, Cloud Explorer. (If you don't have that menu command, you need to install the Azure SDK: Open the New Project dialog, expand Visual C#/Cloud and choose Get Microsoft Azure SDK for .NET.)
When you open your blob store, you'll see a container with a set of blob files. The URI of each file is derived from your Application Insights resource name, its instrumentation key, and the telemetry type, date, and time. (The resource name is all lowercase, and the instrumentation key omits dashes.)
The date and time are UTC and are when the telemetry was deposited in the store - not the time it was generated. So if you write code to download the data, it can move linearly through the data.
Here's the form of the path:
$"{applicationName}_{instrumentationKey}/{type}/{blobDeliveryTimeUtc:yyyy-MM-dd}/{ blobDeliveryTimeUtc:HH}/{blobId}_{blobCreationTimeUtc:yyyyMMdd_HHmmss}.blob"
Where
blobCreationTimeUtc
is the time when blob was created in the internal staging storageblobDeliveryTimeUtc
is the time when blob is copied to the export destination storage
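For example, here's a minimal sketch of how you might build the prefix for one UTC hour and list the matching blobs. The application name, instrumentation key, connection string, and container name are placeholders for your own values, and the listing part assumes the Azure.Storage.Blobs package:

// Build the blob-name prefix for one telemetry type and one UTC hour, then list
// the blobs that match it.
// Requires: using System; using Azure.Storage.Blobs;

string applicationName = "myapp";                                 // resource name, all lowercase
string instrumentationKey = "0123456789abcdef0123456789abcdef";   // dashes removed
string type = "Requests";
DateTime blobDeliveryTimeUtc = DateTime.UtcNow.AddHours(-1);      // an hour that has already been delivered

// Prefix shared by all blobs of that type delivered during that UTC hour:
string hourPrefix =
    $"{applicationName}_{instrumentationKey}/{type}/{blobDeliveryTimeUtc:yyyy-MM-dd}/{blobDeliveryTimeUtc:HH}/";

var container = new BlobContainerClient("<storage-connection-string>", "<container-name>");
foreach (var blob in container.GetBlobs(prefix: hourPrefix))
{
    Console.WriteLine(blob.Name);
}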
- Each blob is a text file that contains multiple '\n'-separated rows. It contains the telemetry processed over a time period of roughly half a minute.
- Each row represents a telemetry data point such as a request or page view.
- Each row is an unformatted JSON document. If you want to view the rows, open the blob in Visual Studio and choose Edit > Advanced > Format File:
Time durations are in ticks, where 10 000 ticks = 1 ms. For example, these values show a time of 1 ms to send a request from the browser, 3 ms to receive it, and 1.8 s to process the page in the browser:
"sendRequest": {"value": 10000.0},
"receiveRequest": {"value": 30000.0},
"clientProcess": {"value": 17970000.0}
Detailed data model reference for the property types and values.
On a small scale, you can write some code to pull apart your data, read it into a spreadsheet, and so on. For example:
// Requires: using System.Collections.Generic; using System.IO;
//           using System.Linq; using Newtonsoft.Json;
// Reads every *.blob file under folderName and deserializes each newline-separated
// JSON row into an instance of T.
private IEnumerable<T> DeserializeMany<T>(string folderName)
{
    var files = Directory.EnumerateFiles(folderName, "*.blob", SearchOption.AllDirectories);
    foreach (var file in files)
    {
        using (var fileReader = File.OpenText(file))
        {
            string fileContent = fileReader.ReadToEnd();
            // Each blob holds one JSON document per line; skip blank lines.
            IEnumerable<string> entities = fileContent.Split('\n').Where(s => !string.IsNullOrWhiteSpace(s));
            foreach (var entity in entities)
            {
                yield return JsonConvert.DeserializeObject<T>(entity);
            }
        }
    }
}
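You could call it like this, for instance. The folder path and the JSON property path are hypothetical; the exact shape of each row depends on which data types you export:

// Hypothetical usage: deserialize every exported Requests row into a JObject
// (Newtonsoft.Json.Linq) and print the request name. Adjust the property path
// to match the schema of your exported data.
var rows = DeserializeMany<Newtonsoft.Json.Linq.JObject>(@"C:\exported-telemetry\Requests");
foreach (var row in rows)
{
    Console.WriteLine(row["request"]?[0]?["name"]);
}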
For a larger code sample, see using a worker role.
You are responsible for managing your storage capacity and deleting the old data if necessary.
If you change the key to your storage, continuous export will stop working. You'll see a notification in your Azure account.
Open the Continuous Export tab and edit your export. Edit the Export Destination, but just leave the same storage selected. Click OK to confirm.
The continuous export will restart.
On larger scales, consider HDInsight - Hadoop clusters in the cloud. HDInsight provides a variety of technologies for managing and analyzing big data, and you could use it to process data that has been exported from Application Insights.
- But all I want is a one-time download of a chart.
Yes, you can do that. At the top of the tab, click Export Data.
- I set up an export, but there's no data in my store.
Did Application Insights receive any telemetry from your app since you set up the export? You'll only receive new data.
- I tried to set up an export, but was denied access.
If the account is owned by your organization, you have to be a member of the owners or contributors groups.
- Can I export straight to my own on-premises store?
No, sorry. Our export engine currently works only with Azure storage.
- Is there any limit to the amount of data you put in my store?
No. We'll keep pushing data in until you delete the export. We'll stop if we hit the outer limits for blob storage, but that's pretty huge. It's up to you to control how much storage you use.
- How many blobs should I see in the storage?
  - For every data type you selected to export, a new blob is created every minute (if data is available).
  - In addition, for applications with high traffic, additional partition units are allocated. In this case, each unit creates a blob every minute.
- I regenerated the key to my storage or changed the name of the container, and now the export doesn't work.
Edit the export and open the export destination tab. Leave the same storage selected as before, and click OK to confirm. Export will restart. If the change was within the past few days, you won't lose data.
- Can I pause the export?
Yes. Click Disable.
- Stream Analytics sample
- Export to SQL using Stream Analytics
- Detailed data model reference for the property types and values.
Diagnostic settings-based export uses a different schema than continuous export. It also supports features that continuous export doesn't, such as:
- Azure storage accounts with VNets, firewalls, and private links.
- Export to Event Hubs.
To migrate to diagnostic settings based export:
- Disable the current continuous export.
- Migrate your application to a workspace-based resource.
- Enable diagnostic settings-based export. Select Diagnostic settings > Add diagnostic setting from within your Application Insights resource.