Hello,
I am having context propagation issues when switching between Cats Effect IO and Scala Futures with the newer Kamon version (2.2.0). I don't see any issues with Cats Effect IO plus the default Scala global execution context.
When using Akka dispatchers, however, I seem to be losing the context. I'm sharing a minimal example: https://gist.github.com/ndchandar/0c54f348a72308d3abb1741f311c650c
Appreciate your help on this.
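For reference, this is roughly how I'm checking whether the context survives the thread hop (a sketch; the context key and dispatcher wiring are just illustrative, the full example is in the gist):

import kamon.Kamon
import kamon.context.Context

import scala.concurrent.{ExecutionContext, Future}

object ContextPropagationCheck extends App {
  Kamon.init()

  // Illustrative key, not part of Kamon's defaults.
  val UserId = Context.key[String]("user-id", "unknown")

  // Works with ExecutionContext.global; swap in an Akka dispatcher
  // (e.g. system.dispatchers.lookup(...)) to reproduce the lost context.
  implicit val ec: ExecutionContext = ExecutionContext.global

  Kamon.runWithContext(Context.of(UserId, "42")) {
    Future {
      // With instrumentation attached, this should print "42" even though
      // the Future body runs on a different thread.
      println(Kamon.currentContext().get(UserId))
    }
  }
}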
Hi, I recently started using kamon-prometheus and noticed that a counter metric always yielded 0 when queried via PromQL's increase or rate functions. The reason was that the corresponding Kamon counter was initialised right before being incremented, so no initial counter value of 0 was ever exported. I did some research and stumbled across this part of the Prometheus documentation: https://prometheus.io/docs/practices/instrumentation/#avoid-missing-metrics
It recommends initialising all metrics before using them. I'd like to do this, but it seems very tedious/unrealistic to do manually by calling every metric with every possible label combination at the start of my application. So I wonder: is there some utility or configuration for kamon-prometheus that initialises all the metrics (or rather, series) automatically, so that initial values are exported?
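For context, the manual pre-registration I'm trying to avoid looks roughly like this (metric and tag names are just illustrative):

import kamon.Kamon

// Touch every series once at startup so a 0 value is exported before
// the first real increment.
val requests = Kamon.counter("http.requests")
for (status <- Seq("200", "404", "500"))
  requests.withTag("status", status)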
Thx in advance!
competitions-service-54cf8bf698-kb2pv competitions-service [application-akka.kafka.default-dispatcher-19] ERROR 2021-06-22 23:55:05 Logger : Error => org.apache.kafka.clients.consumer.KafkaConsumer with message Cannot locate field named groupId for class org.apache.kafka.clients.consumer.KafkaConsumer. Class loader: jdk.internal.loader.ClassLoaders$AppClassLoader@9e89d68: java.lang.IllegalStateException: Cannot locate field named groupId for class org.apache.kafka.clients.consumer.KafkaConsumer
class com.mysql.jdbc.StatementImpl cannot be cast to class kamon.instrumentation.jdbc.HasDatabaseTags (com.mysql.jdbc.StatementImpl and kamon.instrumentation.jdbc.HasDatabaseTags are in unnamed module of loader 'app')
Hey guys,
We tried kamon-bundle 2.2.0 with a Scala/Guice/Kafka application, with tracing configured in Logback, and also enabled the JavaAgent plugin via
.enablePlugins(PlayScala, JavaAgent, JavaAppPackaging)
in build.sbt
but our trace/span IDs sporadically appear and disappear in the application's logs:
[warn][2021-07-01_14:04:07.083] [undefined|undefined] o.a.k.c.NetworkClient
Any pointers on what we might be missing here?
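For reference, this is the agent wiring we believe is needed on top of the plugin enablement above (a sketch; the Kanela version is illustrative):

// build.sbt: the JavaAgent plugin only attaches agents that are declared
// explicitly, so the Kanela agent has to be added as a dependency.
javaAgents += "io.kamon" % "kanela-agent" % "1.0.13"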
Trace IDs are passed correctly when using the request-level API Http().singleRequest(), but they are not passed for the connection-level streaming API Http(system).outgoingConnection(). Any idea how to pass the trace IDs using Kamon with the connection-level streaming API?
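In case it helps frame the question, this is the kind of manual workaround I was considering: injecting the current context into the request headers myself before pushing requests through the connection flow (a sketch, assuming Kamon's default HTTP propagation API):

import akka.http.scaladsl.model.HttpRequest
import akka.http.scaladsl.model.headers.RawHeader
import kamon.Kamon
import kamon.context.HttpPropagation

// Copies the current Kamon context (trace ID included) into the request
// headers, which singleRequest() would otherwise do for us.
def withContextHeaders(request: HttpRequest): HttpRequest = {
  var headers = request.headers
  val writer = new HttpPropagation.HeaderWriter {
    override def write(header: String, value: String): Unit =
      headers = RawHeader(header, value) +: headers
  }
  Kamon.defaultHttpPropagation().write(Kamon.currentContext(), writer)
  request.withHeaders(headers)
}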
Failed to record value [8784530422484846] on [span.elapsed-time,{operation=aws-s3.putObject,span.kind=internal,parentOperation=/v4/apiendpoint,error=false,component=scala.future}] because the value is outside of the configured range. The recorded value was adjusted to the highest trackable value [3600000000000]. You might need to change your dynamic range configuration for this metric
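For anyone hitting the same warning: the recorded value is roughly 101 days in nanoseconds, so a span that was started long ago and only finished now may be the real culprit. That said, my understanding is that the range the warning refers to can be widened per metric (a sketch, assuming Kamon 2.x's metric factory settings):

kamon.metric.factory.custom-settings {
  "span.elapsed-time" {
    # The default highest trackable value is 3600000000000 ns (1 hour).
    highest-trackable-value = 7200000000000
  }
}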
I have a question about jvm.memory.used. I use the InfluxDB and Datadog reporters (the latter via the agent). The JVM metric is available in Influx and I can query it in Grafana, but when I query it in Datadog my service does not appear. Many other metrics work fine, and I do not understand what is different about this one. Can somebody help me figure out what is going on here? I am trying to migrate from Influx/Grafana to Datadog. Thank you!
"kamon-play-2.6" % "1.1.3"
to kamon 2, so I have a "kamon-bundle" % "2.2.2"
dependency as well as a addSbtPlugin("io.kamon" % "sbt-kanela-runner-play-2.8" % "2.0.9")
sbt plugin.java.lang.NoSuchMethodError: kamon.Kamon$.withContext
any ideas how I can fix this?
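From what I can tell, Kamon.withContext was the Kamon 1.x API (it became runWithContext in 2.x), so my guess is that a 1.x artifact is still on the classpath. For reference, the minimal setup I'm aiming for (a sketch, using the versions above):

// build.sbt: kamon-bundle replaces the old kamon-play module entirely,
// so "kamon-play-2.6" must not remain as a direct or transitive dependency.
libraryDependencies += "io.kamon" %% "kamon-bundle" % "2.2.2"

// project/plugins.sbt
addSbtPlugin("io.kamon" % "sbt-kanela-runner-play-2.8" % "2.0.9")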
2021-07-27 13:42:37 ERROR ModuleRegistry:218 - Reporter [DatadogSpanReporter] failed to process a spans tick.
java.io.EOFException: \n not found: limit=0 content=…
at okio.RealBufferedSource.readUtf8LineStrict(RealBufferedSource.kt:332) ~[okio-jvm-2.8.0.jar:?]
at okhttp3.internal.http1.HeadersReader.readLine(HeadersReader.kt:29) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http1.Http1ExchangeCodec.readResponseHeaders(Http1ExchangeCodec.kt:178) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.connection.Exchange.readResponseHeaders(Exchange.kt:106) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.kt:79) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at kamon.okhttp3.instrumentation.KamonTracingInterceptor.intercept(KamonTracingInterceptor.scala:27) ~[kamon-bundle_2.13-2.2.0.jar:2.2.0]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:34) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201) ~[okhttp-4.9.0.jar:?]
at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154) ~[okhttp-4.9.0.jar:?]
at kamon.datadog.package$HttpClient.$anonfun$doRequest$1(package.scala:57) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at scala.util.Try$.apply(Try.scala:210) ~[scala-library-2.13.3.jar:?]
at kamon.datadog.package$HttpClient.doRequest(package.scala:57) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at kamon.datadog.package$HttpClient.doMethodWithBody(package.scala:65) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at kamon.datadog.package$HttpClient.doPut(package.scala:86) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at kamon.datadog.package$HttpClient.doJsonPut(package.scala:96) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at kamon.datadog.DatadogSpanReporter.reportSpans(DatadogSpanReporter.scala:116) ~[kamon-datadog_2.13-2.2.0.jar:2.2.0]
at kamon.module.ModuleRegistry.$anonfun$scheduleSpansBatch$1(ModuleRegistry.scala:217) ~[kamon-core_2.13-2.2.0.jar:2.2.0]
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) ~[scala-library-2.13.3.jar:?]
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:671) ~[scala-library-2.13.3.jar:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:430) [scala-library-2.13.3.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Exception in thread "main" java.lang.ClassCastException: class ch.qos.logback.classic.spi.LoggingEvent cannot be cast to class kamon.instrumentation.context.HasContext (ch.qos.logback.classic.spi.LoggingEvent and kamon.instrumentation.context.HasContext are in unnamed module of loader 'app')
@/all hey folks, this is a reminder that we are migrating to Discord for questions and chat related to Kamon. You can join our Discord here: https://discord.gg/5JuYsDJ7au
Have a great week!
I cannot find any metrics exposed via kamon-prometheus.
application.conf:
kamon.prometheus {
  include-environment-tags = true
  embedded-server {
    hostname = 0.0.0.0
    port = 9404
  }
}
implementation:
class SinkConnector() extends org.apache.kafka.connect.sink.SinkConnector {
  val underlying: AerospikeSinkConnector = new AerospikeSinkConnector()

  override def start(map: util.Map[String, String]): Unit = {
    Kamon.init()
    Kamon.counter("testing-kamon").withoutTags().increment()
    try {
      underlying.start(map)
    } catch {
      case ex: Throwable =>
        println(s"Failure on underlying.start($map)")
        Kamon.counter("underlying-start-connector-failure")
          .withTag("config-file", configFile)
          .withTag("message", ex.getMessage)
          .increment()
        throw ex
    } finally {
      // Note: this stops all Kamon modules (including the Prometheus
      // reporter) as soon as start() returns.
      Kamon.stopModules()
    }
  }
}
dependencies:
"io.kamon" %% "kamon-prometheus" % "2.2.2" exclude("org.slf4j", "slf4j-api"),
"io.kamon" %% "kamon-core" % "2.1.0" exclude("org.slf4j", "slf4j-api")
I already have a JMX Exporter which exposes Kafka metrics on port 9404, and I tried to make Kamon use that port as well. When I remove the application.conf and fall back to the default port 9095, I cannot port-forward to that port for some reason.
Am I missing something? Thanks!
2021-08-31 12:03:52,224 WARN Failed to attach the instrumentation because the Kamon Bundle is not present on the classpath (kamon.Init) [connector-thread-dashboard-connector-profile]
This happens when I'm using "io.kamon" %% "kamon-prometheus" % "2.2.2" exclude("org.slf4j", "*"). Any ideas why?
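From the warning text, my guess is that only kamon-prometheus and kamon-core are on the classpath, while Kamon.init() also tries to attach the instrumentation agent. A sketch of what I believe it is asking for, if automatic instrumentation is wanted (version chosen to match the one above):

// build.sbt: kamon-bundle carries the instrumentation that Kamon.init()
// tries to attach at startup.
libraryDependencies += "io.kamon" %% "kamon-bundle" % "2.2.2"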
Hi. We are experiencing the following WARN message:
Failed to record value [-401488] on [span.processing-time,{operation=serialize,error=false}] because the value is outside of the configured range. The recorded value was adjusted to the highest trackable value [3600000000000]. You might need to change your dynamic range configuration for this metric
So the recorded value is negative. We use Kamon's SpanBuilder.start(Instant); the span is later (within sub-milliseconds) finished via Span.finish(), where the underlying Clock is used to determine the nanos of the finish time. Could this "mixing" of time sources cause negative values to be recorded?
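For what it's worth, this is the pattern we are considering to keep both timestamps on one time source (a sketch; we're assuming Kamon.clock() is the same clock that finish() uses internally):

import kamon.Kamon

// Take the start instant from Kamon's own clock rather than e.g.
// Instant.now(), so start and finish share a single time source.
val span = Kamon.spanBuilder("serialize").start(Kamon.clock().instant())
// ... do the serialization work ...
span.finish()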
Caffeine.newBuilder().recordStats(() -> new KamonStatsCounter("cache_name")).build();
but I'm not sure I understand what needs to be passed to recordStats. I see that it expects a Supplier, but this example isn't working, so I'm probably missing something.
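For reference, the Scala equivalent I have been trying (a sketch; I'm assuming KamonStatsCounter implements Caffeine's StatsCounter and just takes the cache name, and the import path is from memory, so it may differ):

import java.util.function.Supplier
import com.github.benmanes.caffeine.cache.Caffeine
import com.github.benmanes.caffeine.cache.stats.StatsCounter
// NOTE: adjust to wherever KamonStatsCounter lives in your setup.
import kamon.instrumentation.caffeine.KamonStatsCounter

// An explicit Supplier sidesteps any ambiguity in Scala's SAM conversion
// for Caffeine's Supplier<? extends StatsCounter> parameter.
val statsCounter: Supplier[StatsCounter] =
  () => new KamonStatsCounter("cache_name")

val cache = Caffeine.newBuilder()
  .recordStats(statsCounter)
  .build[String, AnyRef]()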
Caused by: java.lang.VerifyError: Expecting a stackmap frame at branch target 102
The application loader looks like this:
class CustomApplicationLoader extends GuiceApplicationLoader {
  override protected def builder(context: Context): GuiceApplicationBuilder =
    super
      .builder(context)
      .eagerlyLoaded()
}
I am trying to add tracing support to a Play 2.8 application with Kamon and Jaeger. I followed the instructions here: https://kamon.io/docs/latest/reporters/jaeger/. I am able to see the startup logs for the Kanela agent as well as the Jaeger reporter, as follows:
[info] Running the application with the Kanela agent
_ __ _ ______
| |/ / | | \ \ \ \
| ' / __ _ _ __ ___| | __ _ \ \ \ \
| < / _` | '_ \ / _ \ |/ _` | ) ) ) )
| . \ (_| | | | | __/ | (_| | / / / /
|_|\_\__,_|_| |_|\___|_|\__,_| /_/_/_/
==============================
Running with Kanela, the Kamon Instrumentation Agent :: (v1.0.8)
--- (Running the application, auto-reloading is enabled) ---
[info] p.c.s.AkkaHttpServer - Listening for HTTP on /0:0:0:0:0:0:0:0:9001
(Server started, use Enter to stop and go back to the console...)
2021-09-23 13:29:11,210 [info] [play-dev-mode-akka.actor.default-dispatcher-11] k.i.p.GuiceModule$KamonLoader - Reconfiguring Kamon with Play's Config
2021-09-23 13:29:11,211 [info] [play-dev-mode-akka.actor.default-dispatcher-11] k.i.p.GuiceModule$KamonLoader - play.core.server.AkkaHttpServerProvider
2021-09-23 13:29:11,213 [info] [play-dev-mode-akka.actor.default-dispatcher-11] k.i.p.GuiceModule$KamonLoader - 10 seconds
2021-09-23 13:29:11,573 [info] [play-dev-mode-akka.actor.default-dispatcher-11] k.j.JaegerReporter - Started the Kamon Jaeger reporter
Jaeger is started through a Docker container with the following command:
docker run -d --name jaeger1 -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 -p 5775:5775/udp -p 6831:6831/udp -p 6832:6832/udp -p 5778:5778 -p 16686:16686 -p 14268:14268 -p 14250:14250 -p 9411:9411 jaegertracing/all-in-one:1.25
However, none of the traces are visible when I hit my Play application's APIs. Is there any configuration I am missing here?
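For completeness, these are the settings I have been double-checking, in case the miss is here (a sketch; these are what I believe the kamon-jaeger 2.x defaults look like, not confirmed):

kamon {
  # If sampling is the issue: the default sampler reports only a subset of
  # traces, while "always" reports everything (useful while testing).
  trace.sampler = "always"

  jaeger {
    # These must match the collector ports published by the Docker command above.
    host = "localhost"
    port = 14268
    protocol = "http"
  }
}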