io.grpc.stub.AbstractStub
I would say you can do the instrumentation yourself in io.grpc.CallCredentials#applyRequestMetadata, by passing it to the callOptions.
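Something along these lines (just a sketch; the header name and the exact Kamon span accessors are my own picks, not something either library prescribes):

import java.util.concurrent.Executor
import io.grpc.{CallCredentials, Metadata}
import kamon.Kamon

// Sketch only: a CallCredentials that copies the current Kamon trace id
// into the outgoing metadata. "x-trace-id" is a made-up header name.
class KamonTraceCredentials extends CallCredentials {
  private val TraceIdHeader =
    Metadata.Key.of("x-trace-id", Metadata.ASCII_STRING_MARSHALLER)

  override def applyRequestMetadata(
      requestInfo: CallCredentials.RequestInfo,
      appExecutor: Executor,
      applier: CallCredentials.MetadataApplier): Unit = {
    val headers = new Metadata()
    headers.put(TraceIdHeader, Kamon.currentSpan().trace.id.string)
    applier.apply(headers)
  }

  override def thisUsesUnstableApi(): Unit = ()
}

// Attaching it to any AbstractStub puts the credentials into its CallOptions:
// val instrumentedStub = stub.withCallCredentials(new KamonTraceCredentials)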
jvm_gc_seconds_sum{collector="g1-young",component="jvm",country="xxx",customer="xxx",datacenter="dc1",environment="prod",generation="young",instance="xxxx:31882",job="xxx-api-metrics",node="host3"}
Getting some really weird results out of the akka metrics. So I decided to do a really simple Ping/Pong app with two static actors and one that gets created and killed every 5 seconds.
The metric for the active actor count just keeps increasing, which makes no sense:
akka_system_active_actors_count{system="area51-kamon-akka"} 2865.0
Same goes for queued messages, which also keep increasing, but as I only send a Ping and a Pong in response, it is impossible for a queue/inbox to be growing.
The PingerPonger code example
I can't see what I'm doing wrong... pointers would be appreciated.
Using latest version of Kamon 2.x, Java 11 OpenJDK and Scala 2.13
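For reference, the test is roughly shaped like this (a simplified sketch of the setup described above, using classic Akka actors; all names are mine):

import akka.actor.{Actor, ActorRef, ActorSystem, PoisonPill, Props}
import kamon.Kamon
import scala.concurrent.duration._

// Two static actors exchanging Ping/Pong, plus a short-lived actor
// created and stopped every 5 seconds.
case object Ping
case object Pong

class Ponger extends Actor {
  def receive: Receive = { case Ping => sender() ! Pong }
}

class Pinger(ponger: ActorRef) extends Actor {
  ponger ! Ping
  def receive: Receive = { case Pong => () }
}

object PingerPonger extends App {
  Kamon.init()
  val system = ActorSystem("area51-kamon-akka")
  val ponger = system.actorOf(Props[Ponger](), "ponger")
  system.actorOf(Props(new Pinger(ponger)), "pinger")

  import system.dispatcher
  // Periodically spawn a Pinger and stop it one second later.
  system.scheduler.scheduleWithFixedDelay(0.seconds, 5.seconds) { () =>
    val temporary = system.actorOf(Props(new Pinger(ponger)))
    system.scheduler.scheduleOnce(1.second, temporary, PoisonPill)
  }
}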
docker run -d -p 9411:9411 openzipkin/zipkin
-javaagent:/..full..path../kanela-agent-1.0.6.jar
Task :run FAILED
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.Seq$.empty()Lscala/collection/GenTraversable;
at kamon.Configuration.$init$(Configuration.scala:28)
at kamon.Kamon$.<init>(Kamon.scala:19)
at kamon.Kamon$.<clinit>(Kamon.scala)
at kamon.Kamon.init(Kamon.scala)
at com.example.AkkaQuickstart.main(AkkaQuickstart.java:9)
FAILURE: Build failed with an exception.
I would appreciate some help, not sure what else to look at.
description and units, just like you have in that Datadog metadata. But in the end, using the DogStatsD reporter, that information does not seem to be sent to Datadog (indeed, it seems it's not even possible to send it through that interface). Looking at the source code of the Kamon reporter for the Datadog HTTP API, it seems that code is not prepared to handle the description and unit information either. It would be great if it were, as it would allow defining the information for a Datadog metric (description and units) in one place. Can anyone confirm this, or am I missing something?
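For reference, on the Kamon side the definition looks something like this (a sketch, with a made-up metric name):

import kamon.Kamon
import kamon.metric.MeasurementUnit

// Description and unit are declared when the metric is defined;
// the metric name here is just for illustration.
val payloadSize = Kamon.histogram(
  "app.payload.size",
  "Size of incoming request payloads",
  MeasurementUnit.information.bytes
)
payloadSize.withoutTags().record(512)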
Hi,
I was attempting to integrate Kamon with a Lagom application. Following the steps for a Play-based app, I'm getting this error:
_ __ _ ______
| |/ / | | \ \ \ \
| ' / __ _ _ __ ___| | __ _ \ \ \ \
| < / _` | '_ \ / _ \ |/ _` | ) ) ) )
| . \ (_| | | | | __/ | (_| | / / / /
|_|\_\__,_|_| |_|\___|_|\__,_| /_/_/_/
==============================
Running with Kanela, the Kamon Instrumentation Agent :: (v1.0.5)
[error] (run-main-0) java.lang.NoClassDefFoundError: java/sql/Date
[error] java.lang.NoClassDefFoundError: java/sql/Date
I'm presuming this is because I'm using OpenJDK 11. I've read, as a general workaround for the above, that you can add something like --add-modules java.sql,java.xml.bind to your java opts. But I'm guessing I need to add these for the Kanela run itself (adding them to the sbt javaOptions doesn't seem to help). Is that possible, and/or does anyone know how to do that?
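For what it's worth, what I tried in build.sbt looks roughly like this (just a sketch; I'm not sure it even reaches the JVM that Kanela runs in):

// javaOptions only reach the application JVM when the run is forked,
// and java.xml.bind was removed in JDK 11, so only java.sql can still
// be added this way.
run / fork := true
run / javaOptions ++= Seq("--add-modules", "java.sql")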
Thanks,
Mike
Hi guys,
I followed the basic setup for Play Framework. Everything seems to work except that I'm not seeing any traces at all:
I'm using:
// project/plugins.sbt
addSbtPlugin("io.kamon" % "sbt-kanela-runner-play-2.8" % "2.0.6")
// build.sbt dependencies:
"io.kamon" %% "kamon-bundle" % "2.1.0",
"io.kamon" %% "kamon-apm-reporter" % "2.1.0"
// application.conf
kamon {
  environment.service = "myService"
  apm.api-key = "xxxxxxxxxxxxxxxxx"
}
Any idea what could be the reason?
I made a few hundred requests to make sure it's not due to sampling.
Some more info on the message above:
In the status page http://localhost:5266/#/ everything looks to be running, so it seems traces are being sent. Why can't I see them in Kamon APM?
"Running with Kanela, the Kamon Instrumentation Agent"
for both nodes.akka.remote.artery.Deserializer:90 - Failed to deserialize message from [unknown] with serializer id [17] and manifest [d]. akka.protobufv3.internal.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
I want to trace IO chains within the service, even when they cross thread boundaries. I added Kamon.init() in my main object and the dependency to build.sbt. Unfortunately, when I deploy the docker image into the k8s cluster, I see that the instrumentation is not started: Instrumentation Disabled, Reporters 1 Started, Metrics 49. Could you please give me a hint of what I'm missing?
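For reference, my main object looks roughly like this (a sketch, names are mine), with Kamon.init() as the first statement, as the docs suggest:

import kamon.Kamon

object Main extends App {
  // Kamon.init() runs before anything else so the bundled instrumentation
  // can attach before the rest of the service starts.
  Kamon.init()
  // ... start the HTTP server / wire the service afterwards
}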
implementation 'io.kamon:kamon-bundle_2.13:2.1.10'
implementation 'io.kamon:kamon-zipkin_2.13:2.1.10'
implementation 'io.kamon:kamon-akka_2.13:2.1.10'
implementation 'io.kamon:kamon-akka-http_2.13:2.1.10'
implementation 'io.kamon:kamon-annotation_2.13:2.1.10'
and the config is kanela.modules.annotation.within = ["^com..*"]
val kamonPrometheusReporter = PrometheusReporter()
Kamon.registerModule("prometheus-reporter", kamonPrometheusReporter)
Kamon.init()
However, the moment I try to bring up the app in my Docker container, which uses the Amazon Corretto JRE, I'm not able to see Akka metrics at my Prometheus server endpoint. I tried bringing it up with the Kanela agent as well, version 1.0.4 (java -javaagent:kanela-agent-1.0.4.jar). Any pointers as to what could have gone wrong?