Hi there, having trouble with the PrometheusPushgatewayReporter. I've set up a Scala sbt project with the following versions:
val kamonApmReporter = "2.1.12"
val kamonBundle = "2.1.12"
val kamonPrometheus = "2.1.12"
val kanelaAgent = "1.0.7"

libraryDependencies ++= Seq(
  "io.kamon" %% "kamon-apm-reporter" % kamonApmReporter,
  "io.kamon" %% "kamon-bundle" % kamonBundle,
  "io.kamon" %% "kamon-prometheus" % kamonPrometheus
)

javaAgents += "io.kamon" % "kanela-agent" % kanelaAgent
Config looks like this:
kamon {
  environment {
    tags {
      app = "my-scala-job"
    }
  }

  modules {
    status-page {
      enabled = false
    }
    apm-reporter {
      enabled = false
    }
    host-metrics {
      enabled = false
    }
    prometheus-reporter {
      enabled = false
    }
    pushgateway-reporter {
      # activate pushgateway-reporter
      enabled = true
    }
  }

  prometheus {
    include-environment-tags = yes
    embedded-server.port = 4001

    #
    # Settings relevant to the PrometheusPushgatewayReporter
    #
    pushgateway {
      api-base-url = "http://localhost:9091/metrics"
      api-url = ${kamon.prometheus.pushgateway.api-base-url}"/job/my-scala-job"
    }
  }
}

kanela {
  show-banner = false
}
I do have a local Docker container running prom/pushgateway:v1.4.0. It lists the job, but only with the default push_time_seconds and push_failure_time_seconds gauges; no other metrics.
If I do
echo "mymetric 99" | curl --data-binary @- http://localhost:9091/metrics/job/my-push-job
it is displayed in the Pushgateway's UI, so I don't expect the issue to be on the Pushgateway's side.
If I activate the "normal" PrometheusReporter, it shows the expected metrics (akka etc.).
Any ideas where the issue could be? Thanks in advance.
I'm seeing a JVM crash when using the host-metrics module on the latest adoptopenjdk/openjdk8-openj9 alpine slim image (jdk8u282-b08_openj9-0.24.0-alpine-slim). If I stay on the previous image (jdk8u275-b01_openj9-0.23.0-alpine-slim) then everything appears to be fine. The crash log:
 _  __                 _       ______
| |/ /                | |      \ \ \ \
| ' /  __ _ _ __   ___| | __ _  \ \ \ \
|  <  / _` | '_ \ / _ \ |/ _` |  ) ) ) )
| . \ (_| | | | |  __/ | (_| |  / / / /
|_|\_\__,_|_| |_|\___|_|\__,_| /_/_/_/
==============================
Running with Kanela, the Kamon Instrumentation Agent :: (v1.0.6)
11:17:28.309 [main] INFO kamon.status.page.StatusPage - Status page started on http://0.0.0.0:5266/ -
Unhandled exception
Type=Segmentation error vmState=0x00040000
J9Generic_Signal_Number=00000018 Signal_Number=0000000b Error_Value=00000000 Signal_Code=00000080
Handler1=00007FC2C2C12DA0 Handler2=00007FC2C24F3020 InaccessibleAddress=0000000000000000
RDI=00007FC2817D85A0 RSI=00007FC2C3F31370 RAX=34F543EFC42A4CC0 RBX=00007FC2780139D0
RCX=00007FC278013AC0 RDX=0000000000000000 R8=00007FC2C406FF60 R9=00007FC27803C418
R10=00007FC2817DC9E0 R11=00007FC2A80F78C9 R12=00007FC2817D8648 R13=00007FC2A8105000
R14=0000000000000001 R15=00007FC2817DCBB0
RIP=00007FC2C3F31980 GS=0000 FS=0000 RSP=00007FC2817D85A0
EFlags=0000000000010206 CS=0033 RBP=0000000000000001 ERR=0000000000000000
TRAPNO=000000000000000D OLDMASK=0000000000000000 CR2=0000000000000000
xmm0 0000000000000000 (f: 0.000000, d: 0.000000e+00)
xmm1 00ff000000000000 (f: 0.000000, d: 7.063274e-304)
xmm2 65677261742f3074 (f: 1949249664.000000, d: 3.040402e+180)
xmm3 2f30303a30303030 (f: 808464448.000000, d: 2.133265e-81)
Hi, I'm unable to compile the project locally because I'm getting:
[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] file:/home/b.wiercinski/.m2/repository/io/netty/netty-transport-native-epoll/4.1.50.Final/netty-transport-native-epoll-4.1.50.Final-linux-x86_64.jar: not found: /home/b.wiercinski/.m2/repository/io/netty/netty-transport-native-epoll/4.1.50.Final/netty-transport-native-epoll-4.1.50.Final-linux-x86_64.jar
Any hints?
Hi, I am working with Kamon Zipkin to trace requests. A request involves:
1) a call to the database that returns its result as a Monix Task, say Task[T]
2) using that result to make calls to a different webservice, which returns Future[HttpResponse]
3) using the result from 2 to make another database call
Before step 1 the trace_id is present, but it gets lost after step 1 and nothing after that gets traced.
If I replace 1) with a static list of records instead of a DB call, tracing happens successfully.
"io.kamon" %% "kamon-core" % "2.1.9",
"io.kamon" %% "kamon-scala-future" % "2.1.9",
"io.kamon" %% "kamon-executors" % "2.1.9",
"io.kamon" %% "kamon-zipkin" % "2.1.9",
"io.kamon" %% "kamon-logback" % "2.1.9"
Is there any known issue with Monix tasks w.r.t. tracing? Thanks in advance.
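[editor's note] The usual mechanism behind this symptom can be sketched without Kamon at all: trace context is commonly kept in thread-local storage, and a DB driver that completes its Task on its own, uninstrumented thread pool leaves that thread-local unset there. A minimal plain-Scala illustration (the traceId ThreadLocal below is a stand-in, not Kamon's actual context storage):

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Stand-in for thread-local trace-context storage (NOT Kamon's real internals).
val traceId = new ThreadLocal[String] { override def initialValue(): String = "none" }
traceId.set("abc-123") // the context is present on the calling thread

// An uninstrumented single-thread pool, standing in for a DB driver's own threads.
val dbPool = Executors.newSingleThreadExecutor()
implicit val dbEc: ExecutionContext = ExecutionContext.fromExecutor(dbPool)

// The work runs on the driver's thread, where the thread-local was never set.
val seenOnDbThread = Await.result(Future { traceId.get() }, 5.seconds)
println(seenOnDbThread) // prints "none": the context did not follow the work

dbPool.shutdown()
```

If this matches your situation, the usual remedies are to run the DB callbacks on an instrumented ExecutionContext (what kamon-executors automates) or to capture and restore the Kamon context explicitly across the Task boundary.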
sbt-assembly and shading rules: I'm not sure if it would be easy (or possible), but it would be nice to have!
Greetings everyone! I'm using Kamon 2.1.3 (kamon-bundle and kamon-datadog dependencies) with Play 2.7.3. Kamon starts up successfully and records the metrics I created myself, but it doesn't report any span metrics, even though the Datadog span reporter is turned on; I don't see them on Datadog or on the Kamon status page.
The span metrics stopped being recorded once I migrated from Kamon 1.x to 2; it used to work just fine back then. Is there anything I should add in code besides the configuration in application.conf? Thanks!
[error] a.a.RepointableActorRef - Error in stage [kamon.instrumentation.akka.http.ServerFlowWrapper$$anon$1$$anon$2]: requirement failed: HTTP/1.0 responses must not have a chunked entity
java.lang.IllegalArgumentException: requirement failed: HTTP/1.0 responses must not have a chunked entity
at scala.Predef$.require(Predef.scala:338)
at akka.http.scaladsl.model.HttpResponse.<init>(HttpMessage.scala:518)
at akka.http.scaladsl.model.HttpResponse.copyImpl(HttpMessage.scala:565)
at akka.http.scaladsl.model.HttpResponse.withEntity(HttpMessage.scala:543)
at kamon.instrumentation.akka.http.ServerFlowWrapper$$anon$1$$anon$2$$anon$5.onPush(ServerFlowWrapper.scala:164)
at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:541)
at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:495)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:390)
at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:625)
at akka.stream.impl.fusing.GraphInterpreterShell$AsyncInput.execute(ActorGraphInterpreter.scala:502)
0.6.7, dated Jun 2017.
Hi folks, I have some questions that have bugged me for a while…
The context: Monix-Reactive stream processing, where messages are buffered when back-pressure is detected; the source stream is grouped/partitioned (100 groups) and there is a back-pressure buffer per group.
I use a range-sampler to track the size of these buffers, incrementing by 1 as each event goes in and decrementing by the size of the batch of events that is released downstream, i.e. when back-pressure is relaxed.
Also note that there are several threads across several instances running this stream.
The questions:
1) Does a 1s vs 1m vs 5m period/range alter the interpretation?
2) If the metric is reported every 1m and the dashboard is set to 5m, would a sum aggregation over-report the buffer size, i.e. will it be shown as 5x bigger than it ever actually is?

Hello guys, sorry for the silly simple question: our team is trying to bump Kamon from 1.x to the latest, basically replacing:
def aspectjWeaver(version: String = "1.8.2"): ModuleID = "org.aspectj" % "aspectjweaver" % version % "java-agent"
def kamonCore(version: String = "1.1.6"): ModuleID = "io.kamon" %% "kamon-core" % version
def kamonSystemMetrics(version: String = "1.0.1"): ModuleID = "io.kamon" %% "kamon-system-metrics" % version
def kamonAkka25(version: String = "1.1.2"): ModuleID = "io.kamon" %% "kamon-akka-2.5" % version
def kamonAkkaHttp25(version: String = "1.1.0"): ModuleID = "io.kamon" %% "kamon-akka-http-2.5" % version
def kamonAkkaRemote25(version: String = "1.1.0"): ModuleID = "io.kamon" %% "kamon-akka-remote-2.5" % version
def kamonScala(version: String = "1.0.0-RC4"): ModuleID = "io.kamon" %% "kamon-scala" % version
def kamonPrometheus(version: String = "1.1.1"): ModuleID = "io.kamon" %% "kamon-prometheus" % version
def kamonJdbc(version: String = "1.0.2"): ModuleID = "io.kamon" %% "kamon-jdbc" % version
With simply
def kamonBundle(version: String = "2.1.14"): ModuleID = "io.kamon" %% "kamon-bundle" % version
def kamonPrometheus(version: String = "2.1.14"): ModuleID = "io.kamon" %% "kamon-prometheus" % version
From what I saw I have two questions: with Akka 2.5.22, does the Kamon bump require a more recent dependency? I am trying to estimate the work required for the bump, and I can already see a lot of renamings! Thanks in advance.
"io.kamon" %% "kamon-bundle" % "2.1.14",
"io.kamon" %% "kamon-datadog" % "2.1.14",
I am getting traces and system metrics but not getting any of my actor metrics like akka.actor.mailbox-size. Akka version 2.5.23. I currently even have the doomsday wildcard enabled instrumentation{
akka {
ask-pattern-timeout-warning = lightweight
filters {
actors {
doomsday-wildcard = on
track {
includes = ["**"]
...
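[editor's note] One common catch with Kamon 2.x actor filters is that the include patterns are matched against actor paths that begin with the actor system name (e.g. "my-system/user/worker-1"), so patterns that don't account for the system name match nothing. A minimal sketch of a track block (the system name my-system is illustrative, not from the message above):

```hocon
kamon.instrumentation.akka.filters {
  actors {
    track {
      # Paths look like "<system-name>/user/...", so name the system explicitly
      # rather than relying on "**" plus doomsday-wildcard.
      includes = [ "my-system/user/**" ]
      excludes = [ ]
    }
  }
}
```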
Hello. I am having problems with my Play Framework (2.8.7) app reporting spans to New Relic. I have it working for my Akka HTTP app, but for some reason the Play app is only showing metrics, not spans. The Kanela agent is set up and I see the log below:
BatchDataSender configured with endpoint https://metric-api.newrelic.com/metric/v1
Is there any documentation for the kamon-mongo instrumentation (https://github.com/kamon-io/Kamon/tree/master/instrumentation/kamon-mongo)?
I included it in my application, but nothing is showing up on the status page or in the metrics. The code does hit a breakpoint in the MongoClientInstrumentation class, so I think it's executing.