OpenTelemetry Tracing on Spring Boot, Java Agent vs. Micrometer Tracing

Published at: 8/8/2024
Categories: opentelemetry, springboot, tracing, kotlin
Author: nfrankel

My demo of OpenTelemetry Tracing features two Spring Boot components. One uses the Java agent, and I noticed a different behavior when I recently upgraded it from v1.x to v2.x. In the other one, I'm using Micrometer Tracing because I compile to GraalVM native, and it can't process Java agents.

I want to compare these three different ways in this post: Java agent v1, Java agent v2, and Micrometer Tracing.

The base application and its infrastructure

I'll use the same base application: a simple Spring Boot application, coded in Kotlin. It offers a single endpoint.

  • The function behind the endpoint is named entry()
  • It calls another function named intermediate()
  • The latter uses a RestClient instance, the successor to RestTemplate, to make a call to the above endpoint
  • To avoid an infinite loop, I pass a custom request header: if the entry() function finds it, it doesn't proceed further
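The loop-prevention idea can be sketched in plain, framework-free Kotlin; the names (handle, callSelf) are hypothetical stand-ins for the real controller and HTTP client:

```kotlin
// Hypothetical sketch of the request flow above: the downstream call
// carries the X-done header, so the second invocation stops.
fun handle(message: String, doneHeader: String?, callSelf: (header: String) -> Unit): List<String> {
    val events = mutableListOf("entry: $message")
    if (doneHeader == null) {       // first request: no header yet
        events += "intermediate"
        callSelf("true")            // stands in for the RestClient call with X-done: true
    }
    return events
}

fun main() {
    var second: List<String> = emptyList()
    // The first call triggers a second, header-carrying one.
    val first = handle("hello", null) { header -> second = handle("done", header) { } }
    println(first)  // [entry: hello, intermediate]
    println(second) // [entry: done]
}
```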

Sample app sequence diagram

It translates into the following code:

@SpringBootApplication
class Agent1xApplication

@RestController
class MicrometerController {

    private val logger = LoggerFactory.getLogger(MicrometerController::class.java)

    @GetMapping("/{message}")
    fun entry(@PathVariable message: String, @RequestHeader("X-done") done: String?) {
        logger.info("entry: $message")
        if (done == null) intermediate()
    }

    fun intermediate() {
        logger.info("intermediate")
        RestClient.builder()
            .baseUrl("http://localhost:8080/done")
            .build()
            .get()
            .header("X-done", "true")
            .retrieve()
            .toBodilessEntity()
    }
}

For every setup, I'll check two stages: the primary stage, with OpenTelemetry enabled, and a customization stage to create additional internal spans.

Micrometer Tracing

Micrometer Tracing stems from Micrometer, a "vendor-neutral application observability facade".

Micrometer Tracing provides a simple facade for the most popular tracer libraries, letting you instrument your JVM-based application code without vendor lock-in. It is designed to add little to no overhead to your tracing collection activity while maximizing the portability of your tracing effort.

-- Micrometer Tracing site

To start with Micrometer Tracing, one needs to add a few dependencies:

  • Spring Boot Actuator, org.springframework.boot:spring-boot-starter-actuator
  • Micrometer Tracing itself, io.micrometer:micrometer-tracing
  • A "bridge" to the target tracing backend API. In my case, it's OpenTelemetry, hence io.micrometer:micrometer-tracing-bridge-otel
  • A concrete exporter to the backend, io.opentelemetry:opentelemetry-exporter-otlp

We don't need a BOM because versions are already defined in the Spring Boot parent.
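Assuming a Gradle build with the Spring Boot plugin (the build tool itself is an assumption on my part), the dependency list above translates to something like:

```kotlin
// build.gradle.kts - the four tracing dependencies; versions are
// managed by the Spring Boot dependency management plugin
dependencies {
    implementation("org.springframework.boot:spring-boot-starter-actuator")
    implementation("io.micrometer:micrometer-tracing")
    implementation("io.micrometer:micrometer-tracing-bridge-otel")
    implementation("io.opentelemetry:opentelemetry-exporter-otlp")
}
```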

Yet, we need two runtime configuration parameters: where the traces should be sent, and the component's name. They are governed by the MANAGEMENT_OTLP_TRACING_ENDPOINT and SPRING_APPLICATION_NAME variables.

services:
  jaeger:
    image: jaegertracing/all-in-one:1.55
    environment:
      - COLLECTOR_OTLP_ENABLED=true                                     #1
    ports:
      - "16686:16686"
  micrometer-tracing:
    build:
      dockerfile: Dockerfile-micrometer
    environment:
      MANAGEMENT_OTLP_TRACING_ENDPOINT: http://jaeger:4318/v1/traces    #2
      SPRING_APPLICATION_NAME: micrometer-tracing                       #3
  1. Enable the OTLP collector in Jaeger
  2. Full URL to Jaeger's OTLP HTTP endpoint (port 4318)
  3. Set the OpenTelemetry service name

Here's the result:

Micrometer traces on Jaeger with no customization

Without any customization, Micrometer creates spans when receiving and sending HTTP requests.

For outgoing requests, the framework must instrument the RestClient. To allow that, we let Spring instantiate it as a bean:

@SpringBootApplication
class MicrometerTracingApplication {

    @Bean
    fun restClient(builder: RestClient.Builder) =
        builder.baseUrl("http://localhost:8080/done").build()
}

We can create manual spans in several ways, one of them via the OpenTelemetry API itself; however, that setup requires a lot of boilerplate code. The most straightforward way is Micrometer's Observation API. Its main benefit is a single API that manages both metrics and traces.

Sample app sequence diagram

Here's the updated code:

@RestController
class MicrometerController(
    private val restClient: RestClient,
    private val registry: ObservationRegistry
) {

    private val logger = LoggerFactory.getLogger(MicrometerController::class.java)

    @GetMapping("/{message}")
    fun entry(@PathVariable message: String, @RequestHeader("X-done") done: String?) {
        logger.info("entry: $message")
        val observation = Observation.start("entry", registry)
        if (done == null) intermediate(observation)
        observation.stop()
    }

    fun intermediate(parent: Observation) {
        logger.info("intermediate")
        val observation = Observation.createNotStarted("intermediate", registry)
            .parentObservation(parent)
            .start()
        restClient.get()
            .header("X-done", "true")
            .retrieve()
            .toBodilessEntity()
        observation.stop()
    }
}

The added observation calls reflect upon the generated traces:

Micrometer traces on Jaeger with the Observation API

OpenTelemetry Agent v1

An alternative to Micrometer Tracing is the generic OpenTelemetry Java Agent. Its main benefit is that it impacts neither the code nor the developers; the agent is a pure runtime-scoped concern.

java -javaagent:opentelemetry-javaagent.jar -jar agent-one-1.0-SNAPSHOT.jar

The agent abides by OpenTelemetry's configuration with environment variables:

services:
  agent-1x:
    build:
      dockerfile: Dockerfile-agent1
    environment:
      OTEL_EXPORTER_OTLP_ENDPOINT: http://jaeger:4317                   #1
      OTEL_RESOURCE_ATTRIBUTES: service.name=agent-1x                   #2
      OTEL_METRICS_EXPORTER: none                                       #3
      OTEL_LOGS_EXPORTER: none                                          #4
    ports:
      - "8081:8080"
  1. Set the protocol, domain, and port; the library appends /v1/traces
  2. Set the OpenTelemetry service name
  3. Don't export metrics
  4. Don't export logs
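For completeness, the Dockerfile-agent1 referenced above might contain something like the following; the base image and jar paths are assumptions:

```dockerfile
FROM eclipse-temurin:21-jre
COPY opentelemetry-javaagent.jar /opt/opentelemetry-javaagent.jar
COPY build/libs/agent-one-1.0-SNAPSHOT.jar /opt/app.jar
ENTRYPOINT ["java", "-javaagent:/opt/opentelemetry-javaagent.jar", "-jar", "/opt/app.jar"]
```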

With no more configuration, we get the following traces:

Agent v1 traces on Jaeger with no customization

The agent automatically tracks requests, both received and sent, as well as functions marked with Spring-related annotations. Traces are correctly nested inside each other, according to the call stack. To trace additional functions, we need to add a dependency to our codebase, io.opentelemetry.instrumentation:opentelemetry-instrumentation-annotations. We can now annotate previously untraced functions with the @WithSpan annotation.

@WithSpan class diagram

The value attribute governs the span's name, while kind translates to a span.kind attribute. If value is an empty string, which is the default, the function's name is used. For my purposes, the defaults are good enough.

@WithSpan
fun intermediate() {
    logger.info("intermediate")
    RestClient.builder()
        .baseUrl("http://localhost:8080/done")
        .build()
        .get()
        .header("X-done", "true")
        .retrieve()
        .toBodilessEntity()
}

It yields the expected new intermediate() trace:

Agent v1 traces on Jaeger with annotations

OpenTelemetry Agent v2

OpenTelemetry released a new major version of the agent in January 2024. I updated my demo with it; traces are now only created when the app receives and sends requests.

Agent v2 traces on Jaeger with no customization

As with the previous version, we can add traces with the @WithSpan annotation. The only difference is that we must also annotate the entry() function, which is no longer traced by default.
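Concretely, with the v2 agent the controller method gets the same annotation as intermediate(); this is a fragment of the existing controller, not a standalone class:

```kotlin
// With the v2 agent, entry() is no longer traced automatically:
// annotate it explicitly, just like intermediate()
@WithSpan
@GetMapping("/{message}")
fun entry(@PathVariable message: String, @RequestHeader("X-done") done: String?) {
    logger.info("entry: $message")
    if (done == null) intermediate()
}
```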

Agent v2 traces on Jaeger with annotations

Discussion

Spring became successful for two reasons: it simplified complex solutions, such as EJB 2, and it provided an abstraction layer over competing libraries. Micrometer Tracing started as an abstraction layer over Zipkin and Jaeger, which made total sense. That argument becomes moot now that OpenTelemetry is supported by most libraries across programming languages and trace collectors. The Observation API remains a considerable benefit of Micrometer Tracing, as it offers a single API over metrics and traces.

On the Java agent side, OpenTelemetry configuration is similar across all tech stacks and libraries: environment variables. I was a bit disappointed when I upgraded from v1 to v2, as the new agent is not Spring-aware: Spring-annotated functions are no longer traced by default. In the end, it's a wise decision: it's much better to be explicit about the spans you want than to remove spans you don't.

The complete source code for this post can be found on GitHub:

To go further:


Originally published at A Java Geek on August 3rd, 2024
