Adrian Cole
@adriancole
what @marcingrzejszczak said is also true: it would get hairy to implement your own profiler, which is why I asked if it is truly important
e.g. many use log correlation, and while there are flaws in that, the method or class is often evident in the logging format
anyway, this is some good context for you, so best to process it and think it over
Invoker-liu
@Invoker-liu
@adriancole thank you. After looking over the docs, it means the client must be a bean. That's sad; I will try something else
Adrian Cole
@adriancole
@Invoker-liu just trying to be honest with you
I worked on the web client stuff recently, reactor web etc.
definitely try the 2.2.3 snapshot first to withhold judgement, but it might be the same. if there's a trace break, just open an issue; it can be solved or at least explained.
MelvinStephen
@MelvinStephen
sure, let me try exploring apache skywalking
Adrian Cole
@adriancole
cool, iirc you can combine sleuth and skywalking, as skywalking can also accept the zipkin format
they are smart and earnest, like us I hope :D
Invoker-liu
@Invoker-liu
@adriancole Thanks! I will consider opening an issue on GitHub
Adrian Cole
@adriancole
:thumbsup:
Ghenadii Batalski
@ghenadiibatalski
hi, could someone please give me a hint about reactor kafka instrumentation?
Marcin Grzejszczak
@marcingrzejszczak
It doesn't work out of the box?
Ghenadii Batalski
@ghenadiibatalski
@marcingrzejszczak sorry, was afk. I don't know how it should work out of the box. Is there any example, or even detailed docs? For webflux and webclient it does work, but not reactive kafka. The way I'm creating the sender/receiver could be the problem, if it's supposed to work
Ghenadii Batalski
@ghenadiibatalski

@marcingrzejszczak this is my sample code

        // sender
        sender = KafkaSender.create(senderOptions);
        processor = DirectProcessor.<Tuple2<String, String>>create().serialize();
        sink = processor.sink();
        sender.send(processor.map(objects -> SenderRecord.create(
                        new ProducerRecord<>("topic1", objects.getT1(), objects.getT2()), objects.getT2())))
                .doOnNext(senderResult -> log.info(senderResult.correlationMetadata()))
                .subscribe();
        // receiver
        receiver = KafkaReceiver.create(receiverOptions);
        receiver
                .receive()
                .doOnNext(receiverRecord -> log.info("key: {}, value: {}", receiverRecord.key(), receiverRecord.value()))
                .subscribe();

send on rest call

    sink.next(Tuples.of("key", ""+System.nanoTime()));

log:

   2020-05-29 00:07:43.063  INFO [demo,,,] 31028 --- [       single-1] com.example.demo.DemoApplication         : 88843267100400
   2020-05-29 00:07:43.064  INFO [demo,,,] 31028 --- [     parallel-2] com.example.demo.DemoApplication         : key: key, value: 88843267100400
Adrian Cole
@adriancole
@ghenadiibatalski maybe thumbsup this or comment with your example for now? reactor/reactor-kafka#128
Marcin Grzejszczak
@marcingrzejszczak
I'm not an expert with reactor kafka but shouldn't you use beans instead of static methods?
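A minimal sketch of what Marcin is suggesting, under assumptions: the class name, bootstrap address, and serializer choices are illustrative, not from the original snippet. The idea is to let Spring manage the sender as a bean instead of building it in static/constructor code, so instrumentation has a chance to wrap it:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderOptions;

// Hypothetical configuration class: names and the bootstrap address are
// illustrative, not taken from the chat.
@Configuration
class KafkaSenderConfig {

    @Bean
    SenderOptions<String, String> senderOptions() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return SenderOptions.create(props);
    }

    // Exposing the sender as a bean gives a BeanPostProcessor (or other
    // instrumentation) a chance to decorate it, which a static factory
    // call buried in user code does not.
    @Bean
    KafkaSender<String, String> kafkaSender(SenderOptions<String, String> options) {
        return KafkaSender.create(options);
    }
}
```

This is Spring configuration only; it needs reactor-kafka and Spring on the classpath and does not by itself guarantee tracing works, it just removes one obstacle.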
Ucky.Kit
@uckyk
image.png
Adrian Cole
@adriancole
@uckyk looks like a context leak (the same trace ID stuck on a thread)
so depending on what's used and what is on the classpath, that might explain it
Ucky.Kit
@uckyk
image.png
Adrian Cole
@adriancole
ok, that is sadly a dead-end version
Ucky.Kit
@uckyk
Because the project is relatively old, the introduced version of spring cloud is relatively low
Adrian Cole
@adriancole
even 2.0 and 2.1 aren't maintained anymore
that version used an approach to "scoping" which was not try/finally in nature
and so had a higher chance of leaks vs the 2.x line, which uses Brave's CurrentTraceContext api
Ucky.Kit
@uckyk
I have observed that the 2.x versions basically use the Brave-related POMs
Adrian Cole
@adriancole
yeah the custom tracing api from 1.x is no longer around
Ucky.Kit
@uckyk

So the best solution for now is for me to upgrade to a newer version, right?
Because there are fewer context leaks in higher versions

Adrian Cole
@adriancole
it is a completely different api, which was designed to cause fewer leaks
there are still some leak problems in sleuth, mostly in reactor
but for normal webmvc there should be none
there's always a chance of a leak
the main thing is less chance; the try/finally api makes it easy to see when people make leaky stuff
e.g. you can see when something that should be used in try/finally carelessly isn't closed
if an api isn't meant to be try/finally, it is very hard to see where a leak occurs
this is the technical reason why 2.x should leak less, and generally does
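The try/finally shape described above is, in Brave's api, a try-with-resources around `CurrentTraceContext.newScope`. A sketch only: the `currentTraceContext` and `context` arguments are assumed to come from your `Tracing` setup, and the wrapper method is hypothetical:

```java
import brave.propagation.CurrentTraceContext;
import brave.propagation.CurrentTraceContext.Scope;
import brave.propagation.TraceContext;

// Hypothetical helper illustrating the scoping pattern, not Sleuth code.
class Scoping {

    // currentTraceContext and context would come from brave.Tracing in a real app.
    void runInScope(CurrentTraceContext currentTraceContext, TraceContext context, Runnable work) {
        // try-with-resources guarantees close() runs even if work throws,
        // restoring the previous context and preventing the leak discussed above.
        try (Scope scope = currentTraceContext.newScope(context)) {
            work.run();
        }
    }
}
```

A missing or forgotten `close()` is exactly the "carelessly doesn't close" case that is easy to spot in review when the api is shaped this way.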
Ucky.Kit
@uckyk
OK, I'll try to upgrade to a 2.x version and see if there are similar problems. Thank you very much!
Adrian Cole
@adriancole
sure, 2.2.3 is out now, so try it (it works with spring boot 2.2 and 2.3)
Ucky.Kit
@uckyk
got it,
thx!!!
Adrian Cole
@adriancole
np!
Marcin Grzejszczak
@marcingrzejszczak
with 1.3.x we had our own tracer implementation; in 2.x we've migrated to Brave, and lots of bugs have been fixed since then, so we absolutely advise migrating to the latest stable version
Veera93
@Veera93
Hi All,
I am relatively new to the sleuth framework. I have a use case where the traceId needs to be sent back in the response header. I was hoping to achieve this using annotations or configuration. When I searched, it was achieved by adding filters (I also found a few old threads mentioning that sending the traceId in the response header was removed due to a potential security concern). In the latest implementation, are there any configurations/annotations for this, or would I have to write a trace filter? Any help on this is appreciated
Adrian Cole
@adriancole
@Veera93 still the same
the tracing system does not mutate the response at the moment, so it can't write response headers
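Since the tracing system won't write response headers itself, the usual workaround is a small filter that does it. A hedged sketch, assuming Brave's `Tracer` is available as a bean and using an illustrative header name; the class name is hypothetical:

```java
import java.io.IOException;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

import brave.Span;
import brave.Tracer;

// Hypothetical filter: class and header names are illustrative.
@Component
class TraceIdResponseFilter extends OncePerRequestFilter {

    private final Tracer tracer;

    TraceIdResponseFilter(Tracer tracer) {
        this.tracer = tracer;
    }

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
            FilterChain chain) throws ServletException, IOException {
        // currentSpan() is whatever span Sleuth's own instrumentation started, if any.
        Span span = tracer.currentSpan();
        if (span != null) {
            response.setHeader("X-B3-TraceId", span.context().traceIdString());
        }
        chain.doFilter(request, response);
    }
}
```

Note the security concern mentioned above still applies: exposing trace IDs to arbitrary callers is a deliberate choice, so consider who can see this header.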
Saisurya Kattamuri
@saisuryakat
Hi
What's the difference between spring.sleuth.remote-fields and spring.sleuth.baggage.remote-fields?
In the sleuth properties appendix it's given with the baggage. prefix, whereas in the documentation it's without.
https://cloud.spring.io/spring-cloud-sleuth/reference/html/appendix.html
image.png
Which of these is correct?
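For reference, if the appendix form applies to your Sleuth version (verify against your exact release; the field names below are purely illustrative), it would look like this in application.yml:

```yaml
spring:
  sleuth:
    baggage:
      remote-fields: x-vcap-request-id, country-code
```

The chat leaves the question unanswered, so treat this fragment as a sketch of the `baggage.`-prefixed form only, not a ruling on which property name is correct for every version.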