extractRequestContext.flatMap { requestContext =>
  val traceHeaderOp = requestContext.request.headers
    .find(_.name().toLowerCase == "x-salt-trace")
    .map(_.value())
  Kamon.withContext(Context(TrafficTracing.ContextKey, traceHeaderOp)) {
    trace("checking")
    mapRouteResult { result =>
      trace(s"[$serviceName]: request processing finished")
      log.info(s"[$serviceName]: request processing finished - ${result.toString}")
      result
    }
  }
}
trace is a method which reports events to Datadog. The context is preserved in the outer scope, but inside the inner scope the context is lost.
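A minimal, self-contained sketch of what seems to be happening (this uses a plain ThreadLocal, not Kamon's actual API; ScopedContextDemo, withContext, and the trace values are all made up for illustration): a callback registered inside the scope runs later, on another thread, after the scope has already closed, so it observes an empty context.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object ScopedContextDemo {
  // stand-in for Kamon's context storage, which is thread-local by default
  private val ctx = new ThreadLocal[String] { override def initialValue() = "empty" }

  // a scoped "withContext": the value is only visible on this thread,
  // and only until the block returns
  def withContext[T](value: String)(body: => T): T = {
    val previous = ctx.get()
    ctx.set(value)
    try body finally ctx.set(previous)
  }

  def run(): (String, String) = {
    val ec = ExecutionContext.fromExecutorService(Executors.newSingleThreadExecutor())
    val (seenInside, callback) = withContext("trace-abc") {
      // simulate a route-result callback that fires later, on another thread
      (ctx.get(), Future { ctx.get() }(ec))
    }
    val seenInCallback = Await.result(callback, 5.seconds)
    ec.shutdown()
    (seenInside, seenInCallback) // ("trace-abc", "empty")
  }
}
```

The fix pattern is generally to capture the context value on the calling thread (in a local val) before registering the callback, and re-apply it inside the callback body.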
> select time,max from "process.hiccups" order by time desc limit 3;
name: process.hiccups
time max
---- ---
1599474360000000000 25296896
1599474300000000000 10223616
1599474240000000000 35127296
> select time,max from "jvm.gc" order by time desc limit 3;
name: jvm.gc
time max
---- ---
1599474360000000000 8
1599474300000000000 7
1599474240000000000 7
My code has the following structure:
logger.info("log1")
val f: Future[SomeData] = for {
  res <- fetchSomeDataFromDb() // this op will be performed by a different ec
  _ = logger.info("log2")
} yield res
"log1" trace id is OK. it will be unique for each request
"log2" trace id is the same for all requests
Hello @/all :wave:
Today I would like to ask for a tiny bit of help from you: we are collecting Kamon testimonials to include on our website and, if you are here on Gitter, there is a good chance that you are having fun with Kamon! Would you like to share your story?
Take 30 seconds to answer these questions and I'll personally get back to you to confirm the logo and/or set up a short interview to hear your story. Thanks a lot, this means a lot to us!
2020-09-15 21:22:00.043 [DEBUG] c.n.t.metrics.MetricBatchSender [New Relic Metric Reporter] [-] Sending a metric batch (number of metrics: 195) to the New Relic metric ingest endpoint)
2020-09-15 21:22:00.043 [DEBUG] c.n.t.m.json.MetricBatchMarshaller [New Relic Metric Reporter] [-] Generating json for metric batch.
2020-09-15 21:22:00.751 [DEBUG] c.n.t.transport.BatchDataSender [New Relic Metric Reporter] [-] Response from New Relic ingest API: code: 403, body: {}
2020-09-15 21:22:00.752 [WARN] c.n.t.transport.BatchDataSender [New Relic Metric Reporter] [-] Response from New Relic ingest API. Discarding batch recommended.: code: 403, body: {}
2020-09-15 21:22:00.754 [ERROR] kamon.module.ModuleRegistry [New Relic Metric Reporter] [-] Reporter [New Relic Metric Reporter] failed to process a metrics tick.
com.newrelic.telemetry.exceptions.DiscardBatchException: The New Relic API failed to process this request and it should not be retried.
at com.newrelic.telemetry.transport.BatchDataSender.sendPayload(BatchDataSender.java:134)
at com.newrelic.telemetry.transport.BatchDataSender.send(BatchDataSender.java:81)
at com.newrelic.telemetry.metrics.MetricBatchSender.sendBatch(MetricBatchSender.java:67)
at kamon.newrelic.metrics.NewRelicMetricsReporter.reportPeriodSnapshot(NewRelicMetricsReporter.scala:54)
at kamon.module.ModuleRegistry.$anonfun$scheduleMetricsTick$1(ModuleRegistry.scala:213)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
Hi all,
I'm using kamon-prometheus and I'm having trouble getting labels applied. This configuration does not produce any labels:
kamon {
  reporters = ["kamon.prometheus.PrometheusReporter"]
  prometheus {
    include-environment-tags = yes
    environment {
      service = "matcher"
      env = "dev"
    }
  }
}
Curl on the exporter:
$ curl localhost:9095
# TYPE doc_count_total counter
doc_count_total 19901.0
# TYPE batch_count_total counter
batch_count_total 778.0
# TYPE assigned_partitions gauge
assigned_partitions 11.0
# TYPE handle_batch_seconds histogram
handle_batch_seconds_bucket{le="0.005"} 0.0
handle_batch_seconds_bucket{le="0.01"} 0.0
handle_batch_seconds_bucket{le="0.025"} 0.0
handle_batch_seconds_bucket{le="0.05"} 0.0
handle_batch_seconds_bucket{le="0.075"} 0.0
handle_batch_seconds_bucket{le="0.1"} 0.0
handle_batch_seconds_bucket{le="0.25"} 24.0
handle_batch_seconds_bucket{le="0.5"} 659.0
handle_batch_seconds_bucket{le="0.75"} 742.0
handle_batch_seconds_bucket{le="1.0"} 754.0
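A sketch of a layout that typically yields the labels in Kamon 2.x, assuming the issue is that the environment block is nested under prometheus: as far as I know, the environment lives at kamon.environment (with custom tags under kamon.environment.tags), and include-environment-tags then exposes them as Prometheus labels. Also, the reporters = [...] list is the Kamon 1.x style; in 2.x, reporters on the classpath are started automatically.

```hocon
kamon {
  environment {
    service = "matcher"
    tags {
      env = "dev"
    }
  }
  prometheus {
    include-environment-tags = yes
  }
}
```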
Hi All,
I am using kamon-prometheus (version 2.1.3) to send metrics to the Prometheus Pushgateway.
The issue I'm hitting is that the message is always empty. Here is my configuration:
prometheus {
  start-embedded-http-server = no
  include-environment-tags = yes
  pushgateway {
    api-url = "http://stg-xxxx"
    connect-timeout = 5 seconds
    read-timeout = 5 seconds
    write-timeout = 5 seconds
  }
  metric {
    tick-interval = 1 seconds
  }
}
This is the error message:
[2020-09-23 16:41:00,128] ERROR Failed to send metrics to Prometheus Pushgateway (kamon.prometheus.PrometheusPushgatewayReporter:44)
java.lang.Exception: Failed to POST metrics to Prometheus Pushgateway with status code [405], Body: [Method Not Allowed
]
at kamon.prometheus.HttpClient.doMethodWithBody(HttpClient.scala:40)
at kamon.prometheus.HttpClient.doPost(HttpClient.scala:25)
at kamon.prometheus.PrometheusPushgatewayReporter.reportPeriodSnapshot(PrometheusPushgatewayReporter.scala:43)
at kamon.module.ModuleRegistry$$anon$1.$anonfun$run$2(ModuleRegistry.scala:176)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:654)
at scala.util.Success.$anonfun$map$1(Try.scala:251)
at scala.util.Success.map(Try.scala:209)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:288)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run$$$capture(Promise.scala:60)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
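One possible cause of the 405, worth checking: the Pushgateway push API only accepts POST/PUT at /metrics/job/&lt;job_name&gt;, and a POST to the root path is rejected with Method Not Allowed. A sketch, assuming the reporter does not append that path itself ("my-service" is a made-up placeholder job name):

```hocon
prometheus {
  pushgateway {
    # assumption: api-url may need to include the full push path
    api-url = "http://stg-xxxx/metrics/job/my-service"
  }
}
```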
instrumentation.akka.http {
  server {
    propagation {
      enabled = false
    }
    tracing {
      enabled = false
    }
  }
}
instrumentation.play.http {
  server {
    propagation {
      enabled = false
    }
    tracing {
      enabled = false
    }
  }
  client {
    propagation {
      enabled = false
    }
    tracing {
      enabled = false
    }
  }
}
instrumentation.http-server.default {
  propagation {
    enabled = false
  }
  tracing {
    enabled = false
  }
}
instrumentation.http-client.default {
  propagation {
    enabled = false
  }
  tracing {
    enabled = false
  }
}
Hi all, I am trying out this technology, but the Zipkin server does not receive any data from my Scala Play app.
This is my configuration for the Scala Play Framework:
// build.sbt
lazy val root = (project in file("."))
  .enablePlugins(PlayScala, JavaAgent)

libraryDependencies += "io.kamon" %% "kamon-bundle" % "2.1.0"

// plugins.sbt
addSbtPlugin("io.kamon" % "sbt-kanela-runner-play-2.7" % "2.0.6")

// app.conf
kamon.zipkin {
  # Hostname and port where the Zipkin Server is running
  host = "localhost"
  port = 9411

  # Decides whether to use HTTP or HTTPS when connecting to Zipkin
  protocol = "http"
}
And run a local zipkin server:
docker run -d -p 9411:9411 openzipkin/zipkin
But when I go to localhost:9411 I do not see any traces.
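Two things worth checking (a sketch, not a confirmed fix): Kamon only ships spans that were sampled, so forcing the sampler to "always" while testing rules sampling out; and the app has to be started through sbt (via sbt-kanela-runner) so the Kanela agent actually attaches.

```hocon
kamon.trace {
  # sample every trace while testing; the default sampler keeps only a fraction
  sampler = "always"
}
```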
I solved this by implementing manual instrumentation. The automatic instrumentation does not work... at least not for a simple Controller#action that does nothing at all.
Hi, I'm trying to use the Kamon bundle (v2.1.4) in an Akka project (Akka v2.6.9). I'm using
javaAgents += "io.kamon" % "kanela-agent" % "1.0.6"
in build.sbt and
addSbtPlugin("io.kamon" % "sbt-kanela-runner" % "2.0.6")
in plugins.sbt, and I call Kamon.init() at the beginning of the main method, but I get this error:
java.lang.ClassCastException: scala.util.Success cannot be cast to kamon.instrumentation.context.HasContext
I appreciate your help. Thank you