Azure Monitor OpenTelemetry Exporter for Python. Use for low-level OpenTelemetry export to Application Insights.
Low-level exporter for sending OpenTelemetry traces, metrics, and logs to Application Insights.
Installation
pip install azure-monitor-opentelemetry-exporter
Environment Variables
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/
When to Use
| Scenario | Use |
|---|---|
| Quick setup, auto-instrumentation | azure-monitor-opentelemetry (distro) |
| Custom OpenTelemetry pipeline | azure-monitor-opentelemetry-exporter (this) |
| Fine-grained control over telemetry | azure-monitor-opentelemetry-exporter (this) |
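For comparison, the distro collapses the same setup into a single call. A minimal sketch (the connection string value is a placeholder):

from azure.monitor.opentelemetry import configure_azure_monitor

# One call wires up trace, metric, and log pipelines with Azure Monitor exporters
configure_azure_monitor(connection_string="InstrumentationKey=xxx;...")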
Trace Exporter
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
# Create exporter
exporter = AzureMonitorTraceExporter(
connection_string="InstrumentationKey=xxx;..."
)
# Configure tracer provider
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
BatchSpanProcessor(exporter)
)
# Use tracer
tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("my-span"):
    print("Hello, World!")
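Spans can carry extra context through the standard OpenTelemetry span API; a small sketch (the span and attribute names are illustrative):

from opentelemetry import trace
from opentelemetry.trace import Status, StatusCode

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("process-order") as span:
    span.set_attribute("order.id", "12345")  # exported as a custom dimension
    try:
        raise ValueError("example failure")
    except ValueError as exc:
        span.record_exception(exc)           # surfaces in the exceptions table
        span.set_status(Status(StatusCode.ERROR))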
Metric Exporter
from opentelemetry import metrics
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from azure.monitor.opentelemetry.exporter import AzureMonitorMetricExporter
# Create exporter
exporter = AzureMonitorMetricExporter(
connection_string="InstrumentationKey=xxx;..."
)
# Configure meter provider
reader = PeriodicExportingMetricReader(exporter, export_interval_millis=60000)
metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))
# Use meter
meter = metrics.get_meter(__name__)
counter = meter.create_counter("requests_total")
counter.add(1, {"route": "/api/users"})
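Counters are not the only instrument; the same meter can create histograms for latency-style measurements. A sketch with illustrative names:

from opentelemetry import metrics

meter = metrics.get_meter(__name__)
histogram = meter.create_histogram(
    "request_duration_ms",
    unit="ms",
    description="Request duration in milliseconds",
)
histogram.record(27.4, {"route": "/api/users"})  # values are aggregated per export interval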
Log Exporter
import logging
from opentelemetry._logs import set_logger_provider
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorLogExporter
# Create exporter
exporter = AzureMonitorLogExporter(
connection_string="InstrumentationKey=xxx;..."
)
# Configure logger provider
logger_provider = LoggerProvider()
logger_provider.add_log_record_processor(BatchLogRecordProcessor(exporter))
set_logger_provider(logger_provider)
# Add handler to Python logging
handler = LoggingHandler(level=logging.INFO, logger_provider=logger_provider)
logging.getLogger().addHandler(handler)
# Use logging
logger = logging.getLogger(__name__)
logger.info("This will be sent to Application Insights")
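Log records emitted inside an active span carry the current trace and span IDs, so Application Insights can correlate them with the surrounding request. A minimal sketch:

import logging
from opentelemetry import trace

tracer = trace.get_tracer(__name__)
logger = logging.getLogger(__name__)

with tracer.start_as_current_span("handle-request"):
    # This record includes the active trace context when exported
    logger.warning("Something looked off while handling the request")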
From Environment Variable
Exporters read APPLICATIONINSIGHTS_CONNECTION_STRING automatically:
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
# Connection string from environment
exporter = AzureMonitorTraceExporter()
Azure AD Authentication
from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
exporter = AzureMonitorTraceExporter(
credential=DefaultAzureCredential()
)
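The credential authenticates ingestion but does not replace the connection string, which still identifies the target resource; the identity also needs ingestion permissions (typically the Monitoring Metrics Publisher role). A sketch combining both:

from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

exporter = AzureMonitorTraceExporter(
    connection_string="InstrumentationKey=xxx;...",
    credential=DefaultAzureCredential(),  # identity needs ingestion permissions
)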
Sampling
Use ApplicationInsightsSampler for consistent sampling:
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from azure.monitor.opentelemetry.exporter import ApplicationInsightsSampler
# Sample 10% of traces
sampler = ApplicationInsightsSampler(sampling_ratio=0.1)
trace.set_tracer_provider(TracerProvider(sampler=sampler))
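To export the sampled spans, attach the exporter to the same provider; a minimal sketch (connection string assumed to come from the environment):

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(AzureMonitorTraceExporter())
)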
Offline Storage
Configure offline storage for retry:
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
exporter = AzureMonitorTraceExporter(
connection_string="...",
storage_directory="/path/to/storage", # Custom storage path
disable_offline_storage=False # Enable retry (default)
)
Disable Offline Storage
exporter = AzureMonitorTraceExporter(
connection_string="...",
disable_offline_storage=True # No retry on failure
)
Sovereign Clouds
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
# Azure Government
credential = DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT)
exporter = AzureMonitorTraceExporter(
connection_string="InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.us/",
credential=credential
)
Exporter Types
| Exporter | Telemetry Type | Application Insights Table |
|---|---|---|
| AzureMonitorTraceExporter | Traces/Spans | requests, dependencies, exceptions |
| AzureMonitorMetricExporter | Metrics | customMetrics, performanceCounters |
| AzureMonitorLogExporter | Logs | traces, customEvents |
Configuration Options
| Parameter | Description | Default |
|---|---|---|
| connection_string | Application Insights connection string | From env var |
| credential | Azure credential for AAD auth | None |
| disable_offline_storage | Disable retry storage | False |
| storage_directory | Custom storage path | Temp directory |
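These options can be combined on one exporter; a sketch with illustrative values (the storage path is hypothetical):

from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

exporter = AzureMonitorTraceExporter(
    connection_string="InstrumentationKey=xxx;...",
    credential=DefaultAzureCredential(),
    storage_directory="/var/lib/myapp/otel-offline",  # hypothetical path
    disable_offline_storage=False,
)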
Best Practices
- Use BatchSpanProcessor for production (not SimpleSpanProcessor)
- Use ApplicationInsightsSampler for consistent sampling across services
- Enable offline storage for reliability in production
- Use AAD authentication instead of instrumentation keys
- Set export intervals and batch sizes appropriate for your workload (see the sketch after this list)
- Use the distro (azure-monitor-opentelemetry) unless you need custom pipelines
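For export tuning, the standard OpenTelemetry BatchSpanProcessor parameters apply; a sketch with illustrative values, not recommendations:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

processor = BatchSpanProcessor(
    AzureMonitorTraceExporter(),
    max_queue_size=2048,          # spans buffered before new spans are dropped
    schedule_delay_millis=5000,   # how often batches are flushed
    max_export_batch_size=512,    # spans per export call
)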