Dimitrios Klimis

@dimi-nk says it is async, but doesn't copy the properties defined into asciidoc

Thanks @adriancole, that clears it up

Hello everyone!
I am using Kafka Streams alongside Sleuth to enable tracing both intra- and inter-service.
Right now, I am setting setUncaughtExceptionHandler on the StreamsBuilderFactoryBean class. This allows me to catch expected and unexpected exceptions that occur inside the Streams API.
So far so good: when an exception is thrown, the callback passed to the setUncaughtExceptionHandler setter is called as expected.
However, when I try to produce logs inside the callback, the traceId is not preserved. For example, calling tracer.currentSpan() inside the callback returns null.
Does this happen because the callback runs on a separate thread? If so, what is the best way to pass the Tracer context to a new thread?
9 replies
Marcin Grzejszczak
it might be the case. You'd need to instrument whatever is spawning that thread
Is there any way of checking what is spawning that thread? I really don't know where to look...
Any IntelliJ tool, profiler, etc.?
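For anyone hitting the same null currentSpan(): the mechanics can be illustrated with plain JDK code. This is a hypothetical, JDK-only sketch of what trace-context propagation does under the hood; TRACE_ID here stands in for Brave's real thread-local context (Tracer and CurrentTraceContext are Brave classes, not shown):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.FutureTask;

// Models why tracer.currentSpan() is null on an uninstrumented thread, and how
// context propagation works in principle. TRACE_ID is a stand-in for Brave's
// thread-local trace context; the real wrapping lives in CurrentTraceContext.
public class ContextPropagation {
    static final ThreadLocal<String> TRACE_ID = new ThreadLocal<>();

    // Wrap a task so it carries the caller's context onto the worker thread.
    static Runnable wrap(Runnable task) {
        String captured = TRACE_ID.get(); // captured on the submitting thread
        return () -> {
            TRACE_ID.set(captured);       // restored on the worker thread
            try {
                task.run();
            } finally {
                TRACE_ID.remove();
            }
        };
    }

    public static void main(String[] args) throws Exception {
        TRACE_ID.set("05e36a8d977eee59");
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Unwrapped: the worker thread sees no context (like currentSpan() == null).
        Future<String> bare = pool.submit(() -> String.valueOf(TRACE_ID.get()));

        // Wrapped: the context travels with the task.
        FutureTask<String> wrapped = new FutureTask<>(() -> TRACE_ID.get());
        pool.submit(wrap(wrapped));

        System.out.println(bare.get());    // the bare task saw no context
        System.out.println(wrapped.get()); // the wrapped task saw the id
        pool.shutdown();
    }
}
```

Sleuth's instrumented executors do essentially this wrapping for you; the uncaught-exception callback runs on a Streams-managed thread that nothing has wrapped, which is why the context is absent there.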
Templeton Peck
Sorry, another Sleuth/Zipkin/X-Ray question. Even on normal MVC calls it isn't setting http.url; is that right? Here is an example:
    "Duration": 0.003,
    "Id": "1-5edfc25c-29fa208298d8dcaabe39b7ca",
    "Segments": [
        {
            "Document": {
                "id": "653dc59555a64e76",
                "name": "alt-service",
                "start_time": 1591722588.818313,
                "trace_id": "1-5edfc25c-29fa208298d8dcaabe39b7ca",
                "end_time": 1591722588.821566,
                "http": {
                    "request": {
                        "method": "GET"
                    }
                },
                "annotations": {
                    "mvc_controller_class": "AltController",
                    "mvc_controller_method": "query",
                    "http_path": "/v1/",
                    "operation": "get /v1/"
                }
            },
            "Id": "653dc59555a64e76"
        }
    ]
Jorge Tovar
Hello, how can I add Sleuth/Zipkin instrumentation to Retrofit (the HTTP client) used between many Spring Boot microservices? Can you help me please? (Y)
With a RestTemplate @Bean it works nicely, but we have Retrofit as our default HTTP client
I can see in this post that there is no support :S
Adrian Cole
@jorgetovar_gitlab retrofit is built over okhttp, and okhttp is supported by brave
3 replies
answered your stackoverflow
Adrian Cole
@LtTempletonPeck the default parsing doesn't include the HTTP URL for security and performance reasons. There's an example of adding more (for any library, not just MVC) here
1 reply
Jorge Tovar
OK, I will look at that. In short: use plain Brave to add all the headers logic, right?
Marcin Grzejszczak
@ricardoneves93 what I do is I debug the internals of a given library to understand how it works and how those threads are spawned. Maybe there's a better way but I don't know it ;)
Jorge Tovar
I used this bean to create the Retrofit instance with tracing in okhttp3, but it doesn't add the headers on the HTTP call... what could be the error?
 @Bean("retrofitTracing")
public Retrofit retrofitTracing(HttpTracing tracing) {
    OkHttpClient.Builder httpClient = new OkHttpClient.Builder();
    HttpLoggingInterceptor logging = new HttpLoggingInterceptor();
    // httpClient.addInterceptor(new RetrofitSleuthInterceptor(tracer));
    // (with custom headers it works)
    Retrofit.Builder builder = new Retrofit.Builder();
    OkHttpClient client = httpClient.build();
    return builder.callFactory(TracingCallFactory.create(tracing, client))
    // ... (rest of the builder chain truncated in the original message)
10 replies
I ended up using a dynamic proxy and a bean post processor to add tracing support for reactive mongo repositories. Is this something that’s of interest to support as part of sleuth itself? Just wondering if this can be supported via auto configuration. Is there a better way?
Hi Marcin, we are using Spring WebFlux in our API and a WebClient to invoke a remote service; springCloudVersion is "Hoxton.RELEASE". I see that the logs emitted on reactor threads are missing the MDC context information, but the X-B3 headers are still in the payload. To fix this we set the MDC context from the X-B3 headers in a custom logger attached to jettyHttpClient, and then we can see the logs carrying the MDC information. Does this mean Sleuth sets the X-B3 headers on reactor threads but fails to add the MDC context to those threads as part of the TraceWebFilter implementation?
I just want to be sure we are not doing something that Sleuth already does.
Adrian Cole
@jorgetovar_gitlab here's what I posted to your stackoverflow

You can configure Brave's Call.Factory for Retrofit with the HttpTracing bean configured by Sleuth and an instance of OkHttpClient:

retrofitBuilder.callFactory(TracingCallFactory.create(httpTracing, okhttp))

no reason to do anything else, I think
@krraghavan openzipkin/brave#1113
Hi @adriancole @marcingrzejszczak, can you please let me know if I have to raise a Stack Overflow question before posting here? I posted earlier about Sleuth adding the X-B3 headers but not adding the MDC context to reactor threads, so we had to handle this explicitly in our logger by grabbing the info from the X-B3 headers and setting the MDC context on the reactor thread. Can you please let me know if this is right?
58 replies
Hi @adriancole, please let me know if you need the Gradle project. Thanks much, Malika
I posted the sample code in the Thread you started
@adriancole thanks for the link. The solution I have actually works at the Spring repository abstraction rather than the driver level; I think there's room for both, although the driver one is likely only useful for deeper troubleshooting. I wrap the repository classes in a dynamic proxy and create a new child span in the handler.
Good afternoon,
I'm facing an issue using Sleuth in my application when calling an external Java API which does some calculation and returns a CompletableFuture<T>.
The Sleuth fields (traceId...) are not automatically injected into the MDC for the logs produced during that CompletableFuture calculation.
I checked the documentation, but the CompletableFuture is created and executed outside my application.
Is there any solution so I can make the complete ecosystem aware of my application's MDC (which contains the Sleuth fields)?
5 replies
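A pattern that may help when the future is created by code you cannot instrument: hop the continuation back onto an executor that restores a context captured in the caller. This is a JDK-only sketch; CONTEXT stands in for org.slf4j.MDC, and contextAware is a hypothetical helper, similar in spirit to Sleuth's wrapped executors:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// When a third-party API hands back a CompletableFuture completing on its own
// threads, MDC-style context is lost there. One workaround: chain continuations
// through an executor that restores a context captured in the calling thread.
public class MdcHop {
    // Stand-in for the MDC, which is a thread-local map of logging fields.
    static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    // Hypothetical helper: captures the creating thread's context once, and
    // restores it around every task the returned executor runs.
    static Executor contextAware(Executor delegate) {
        Map<String, String> captured = new HashMap<>(CONTEXT.get());
        return task -> delegate.execute(() -> {
            CONTEXT.set(captured);
            try { task.run(); } finally { CONTEXT.remove(); }
        });
    }

    public static void main(String[] args) throws Exception {
        CONTEXT.get().put("traceId", "1-5edfc25c");
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Pretend this future came from the external calculation API.
        CompletableFuture<Integer> external = CompletableFuture.supplyAsync(() -> 42, pool);

        // Continuations chained via the context-aware executor see the traceId again.
        String seen = external
                .thenApplyAsync(v -> CONTEXT.get().get("traceId") + ":" + v,
                                contextAware(pool))
                .get();

        System.out.println(seen);
        pool.shutdown();
    }
}
```

The external supplier itself still runs without the context; only the continuations you own get it back, which is usually enough to log with the traceId on your side of the boundary.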
Jorge Tovar
hello, the Sleuth instrumentation is working perfectly with my HTTP clients, but I need to trace SQS events. I added the dependencies and all the required beans, but I don't see trace headers in the logs
What do I need to do? From all the posts I've seen, this should work out of the box
2 replies
    public QueueMessageHandler queueMessageHandler() {
        QueueMessageHandlerFactory queueMsgHandlerFactory = new QueueMessageHandlerFactory();
        QueueMessageHandler queueMessageHandler = queueMsgHandlerFactory.createQueueMessageHandler();
        List<HandlerMethodArgumentResolver> list = new ArrayList<>();
        list.add(new HeadersMethodArgumentResolver());
        list.add(new PayloadArgumentResolver(new MappingJackson2MessageConverter()));
        return queueMessageHandler;
    }
Jorge Tovar
Another question @adriancole: how can I get a span from the string "05e36a8d977eee59" in Sleuth?
14 replies
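On turning a hex string back into span-id form: B3 ids are unsigned 64-bit values written as 16 lowercase hex characters, so the JDK can parse and re-format them losslessly. A small sketch (the actual Brave API for building a trace context from these longs is what the thread above discusses, and is not reproduced here):

```java
// B3 span ids like "05e36a8d977eee59" are 16 lowercase hex chars encoding an
// unsigned 64-bit value. Brave stores ids as primitive longs internally.
public class SpanIdParse {
    public static void main(String[] args) {
        String spanId = "05e36a8d977eee59";

        // Parse the hex string into the long form tracing libraries work with.
        long id = Long.parseUnsignedLong(spanId, 16);

        // Formatting back pads to 16 hex chars, round-tripping the original.
        String roundTripped = String.format("%016x", id);
        System.out.println(roundTripped);
        System.out.println(roundTripped.equals(spanId));
    }
}
```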
@here I want to trace the Kafka messages received from a Kafka topic using Spring Cloud Sleuth
Is there a good blog post or any documentation I can refer to?
2020-06-13 00:17:24.523  INFO 17274 --- [ntainer#0-0-C-1] c.s.w.listeners.KafkaListeners   : Message Received from Kafka topic TEST: Hello
2020-06-13 00:17:24.523  INFO 17274 --- [ntainer#0-0-C-1] com.test.arun.util.ServicebusQueueImpl    : Sending Message to Service Bus: Hello
2020-06-13 00:17:24.600  INFO 17274 --- [ntainer#0-0-C-1] c.s.w.listeners.KafkaListeners   : ServiceBus Test Topic: Hello
2020-06-13 00:17:42.220  INFO 17274 --- [ntainer#0-0-C-1] o.s.s.c.ThreadPoolTaskScheduler          : Shutting down ExecutorService
I added the below dependency
I don't see any traces or spans added to each message
sraka kaka
Everything works fine locally; the problem appears when I run the applications in containers

does anyone know what the problem may be?

my docker-compose file:

version: '3'
services:
  tester:
    image: openzipkin/demos
    ports:
      - 8083:8083
    depends_on:
      - zipkin
    links:
      - zipkin
  zipkin:
    image: openzipkin/zipkin
    ports:
      - 9411:9411

i'm getting

2020-06-13 20:59:53.322  WARN [tester,,,] 1 --- [           main] o.s.c.s.zipkin2.ZipkinAutoConfiguration  : Check result of the [] contains an error [CheckResult{ok=false, error=org.springframework.web.client.ResourceAccessException: I/O error on POST request for "http://localhost:9411/api/v2/spans": Connection refused (Connection refused); nested exception is Connection refused (Connection refused)}]

registry_1  | org.springframework.web.client.ResourceAccessException: I/O error on POST request for "http://localhost:9411/api/v2/spans": Connection refused (Connection refused); nested exception is Connection refused (Connection refused)


I've been trying to solve it for several days
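Reading the error log above: inside the app's container, http://localhost:9411 points at the app container itself, not at Zipkin, so the reporter can never connect. The usual fix is to address Zipkin by its compose service name. A sketch, assuming the Zipkin service is named zipkin in the compose file:

```yaml
# application.yml of the traced app (hypothetical file; the property is
# spring.zipkin.base-url / spring.zipkin.baseUrl in Hoxton-era starters)
spring:
  zipkin:
    base-url: http://zipkin:9411   # compose service hostname, not localhost
```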
Adrian Cole
also, you can use "container_name" I think and not need links; e.g. set "container_name" to zipkin and then just use the hostname zipkin in your app config
Marcin Grzejszczak
@krraghavan can you please file an issue to add support for retrofit?
@malika15 can you upgrade to the latest version of sleuth and to the latest release train?
I have a question regarding baggage, headers, and the MDC; I'd appreciate it if you could have a look:
Marcin Grzejszczak
I answered 23 minutes ago