
OpenTelemetry instrumentation

In this section, we will explore how to enable OpenTelemetry instrumentation when using KafkaFlow.

KafkaFlow includes support for Traces and Baggage signals using OpenTelemetry instrumentation.

tip

You can find a sample on how to enable OpenTelemetry here.

Including OpenTelemetry instrumentation in your code

Add the package KafkaFlow.OpenTelemetry to the project and call the extension method AddOpenTelemetryInstrumentation in your configuration:

services.AddKafka(
    kafka => kafka
        .AddCluster(...)
        .AddOpenTelemetryInstrumentation()
);

Once you have your .NET application instrumentation configured (see here), the KafkaFlow activity can be captured by adding the source KafkaFlowInstrumentation.ActivitySourceName to the tracer provider builder, e.g.:

using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource(KafkaFlowInstrumentation.ActivitySourceName)
    ...
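
As a fuller sketch, the tracer provider could be wired up as shown below. The service name and the console exporter are illustrative assumptions (the exporter requires the OpenTelemetry.Exporter.Console package), not requirements of KafkaFlow:

// Sketch only: service name and exporter are assumptions, swap in your own.
using KafkaFlow.OpenTelemetry; // namespace assumed to match the package name
using OpenTelemetry;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .SetResourceBuilder(ResourceBuilder.CreateDefault().AddService("my-kafkaflow-app"))
    .AddSource(KafkaFlowInstrumentation.ActivitySourceName)
    .AddConsoleExporter()
    .Build();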

Advanced Configuration

The instrumentation can be configured to change the default behavior by using KafkaFlowInstrumentationOptions.

Enrich

This option can be used to enrich the Activity with additional information from the IMessageContext object. It exposes separate delegates for enriching producer and consumer activities:

services.AddKafka(
    kafka => kafka
        .AddCluster(...)
        .AddOpenTelemetryInstrumentation(options =>
        {
            options.EnrichProducer = (activity, messageContext) =>
            {
                activity.SetTag("messaging.destination.producername", "KafkaFlowOtel");
            };

            options.EnrichConsumer = (activity, messageContext) =>
            {
                activity.SetTag("messaging.destination.group.id", messageContext.ConsumerContext.GroupId);
            };
        })
);

Using .NET Automatic Instrumentation

When using .NET automatic instrumentation, the KafkaFlow activity can be captured by adding the ActivitySource name KafkaFlow.OpenTelemetry to the environment variable OTEL_DOTNET_AUTO_TRACES_ADDITIONAL_SOURCES.
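
For example, assuming a Unix-like shell and that no other additional sources are configured, the variable (which accepts a comma-separated list of source names) could be set as:

export OTEL_DOTNET_AUTO_TRACES_ADDITIONAL_SOURCES="KafkaFlow.OpenTelemetry"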

Propagation

KafkaFlow uses Propagation, the mechanism that moves contextual data between services and processes. When a message is produced using a KafkaFlow producer and consumed by a KafkaFlow consumer, the context will automatically be propagated.
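
As a minimal sketch of what this enables, baggage attached before producing becomes available again in the consumer's handler. Both sides are assumed to be configured with AddOpenTelemetryInstrumentation; the names "tenant.id", "my-topic", and the sample types below are placeholders, not KafkaFlow identifiers:

using System;
using System.Threading.Tasks;
using KafkaFlow;
using OpenTelemetry;

public class SampleMessage { }

public class SampleProducerService
{
    private readonly IMessageProducer<SampleProducerService> producer;

    public SampleProducerService(IMessageProducer<SampleProducerService> producer)
    {
        this.producer = producer;
    }

    public Task ProduceAsync()
    {
        // Baggage attached before producing travels with the message.
        Baggage.SetBaggage("tenant.id", "contoso");
        return this.producer.ProduceAsync("my-topic", Guid.NewGuid().ToString(), new SampleMessage());
    }
}

public class SampleMessageHandler : IMessageHandler<SampleMessage>
{
    public Task Handle(IMessageContext context, SampleMessage message)
    {
        // The baggage set on the producer side is restored here.
        var tenantId = Baggage.GetBaggage("tenant.id");
        return Task.CompletedTask;
    }
}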