Ivan Topolnjak
@ivantopo
ok!
btw
depending on how complicated the rules are, you might be able to get the name you want just by using configuration
Thomas Ploch
@tPl0ch
@ivantopo Might it be related to us using a custom Avro Serializer when sending messages? Would we need to add the traces and spans in our serializers?
Peter Nerg
@pnerg

I'll peek at that too, though I fear the config might be complicated, as it's about slicing the URL according to a JSON API spec:

/api/<resource>/<id> -> <resource>
/api/<resource>/<id>/<relationship> -> <resource>-<relationship>
/api/<servicefunction> -> <servicefunction>
The code in Scala is very simple, so I would love to just migrate it as opposed to trying to configure it with complicated rules, if that is even possible.
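For reference, a minimal Scala sketch of the slicing rules listed above (hypothetical names; the real JSON API handling may be richer):

```scala
// Hypothetical sketch: derive an operation name from a JSON-API-style path.
// /api/<resource>/<id>                -> <resource>
// /api/<resource>/<id>/<relationship> -> <resource>-<relationship>
// /api/<servicefunction>              -> <servicefunction>
def operationName(path: String): String =
  path.stripPrefix("/api/").split('/').toList match {
    case resource :: _ :: relationship :: _ => s"$resource-$relationship"
    case resource :: _ :: Nil               => resource
    case serviceFunction :: Nil             => serviceFunction
    case _                                  => "unknown"
  }

// operationName("/api/users/42")          // "users"
// operationName("/api/users/42/comments") // "users-comments"
// operationName("/api/reindex")           // "reindex"
```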

Ivan Topolnjak
@ivantopo
ok ok
SahilAggarwalG
@SahilAggarwalG
@dpsoft -Dkanela.modules.executor-service-capture-on-submit.enabled=true — the above flag is marked experimental, as seen in the reference.conf of the jar. Is it OK to use it in production?
Diego Parra
@dpsoft
@SahilAggarwalG we currently have some services that have been running in production for several months without issues with the flag activated. We need to adjust some things, but I think it is safe in the majority of use cases.
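For anyone searching later, the same flag discussed above can presumably also be set in HOCON form rather than as a -D system property (a sketch, assuming the property name maps one-to-one to a config path, which is the usual Typesafe Config convention):

```
# application.conf sketch — same key as the system property above
kanela.modules.executor-service-capture-on-submit {
  enabled = true
}
```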
danischroeter
@danischroeter

Is the akka remote Kamon module supporting Artery? I don't see traces being joined on other nodes.
we didn't get around that just yet :/

holy c#$%*
I was looking into why 2.0 does not work anymore :( Very frustrating.
Would be nice to document what is not supposed to work...
@ivantopo How can this issue be tracked? @tPl0ch said he would do a PR. I can open an issue if that helps. Maybe I can also lend a hand, but the migration was already quite costly...
I guess I need to stop the migration to 2.0 for now...
Btw: Artery becomes the default in the upcoming Akka 2.6...

Samuel
@garraspin
@mladens when do you think you will have kamon-http4s ready?
Mladen Subotić
@mladens
expecting to make a PR sometime today/tomorrow
Samuel
@garraspin
I started a PR but I'm stuck fixing HttpMetricsSpec; it looks like there was a trait MetricInspection with a bunch of type classes that are gone, and I can't find a replacement
Mladen Subotić
@mladens
it moved to InstrumentInspection.Syntax
Ivan Topolnjak
@ivantopo
@danischroeter @tPl0ch regarding Artery support: kamon-io/kamon-akka#58
please subscribe to that one
danischroeter
@danischroeter
:thumbsup:
Joe Martinez
@JoePercipientAI
@mladens I assume you meant kanela.debug-mode = true. I tried it set to both true and false, and in neither case do I see anything about Kamon in the logs, other than the banner.
Mladen Subotić
@mladens
yup, sorry, my bad: kanela.debug-mode = true. I also forgot to mention dropping the log level: kanela.log-level = "DEBUG"
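Put together, the two settings above would look like this in application.conf (a sketch; the config dump later in the channel shows kanela as its own root-level section, not nested under kamon):

```
# application.conf sketch — kanela is a root-level section
kanela {
  debug-mode = true
  log-level  = "DEBUG"
}
```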
Joe Martinez
@JoePercipientAI
@mladens I just tried that, and it didn't make a difference. I still get the same output in the logs, and there are no DEBUG log entries at all.
Also can you confirm... Does the "Kanela" config section go under "Kamon", or is it its own root level?
Joe Martinez
@JoePercipientAI
Ok, that's what I did.
Mladen Subotić
@mladens
try logging the resulting conf, it seems like it's getting lost in the merge
gmim
@gilshoshan17_twitter

hi. I'm getting a java.lang.NullPointerException when initializing Kamon 2.0. I'm using my own thread pool (extends ExecutorServiceFactory); the issue is in the CaptureActorSystemNameOnExecutorConfigurator object. It looks like it can't support any custom-made thread pool. Is this the case?
[ERROR] [09/12/2019 18:07:13.430] [myApp1-actor-system-akka.actor.default-dispatcher-5] [akka.dispatch.Dispatcher] null
java.lang.NullPointerException
    at java.util.regex.Matcher.getTextLength(Matcher.java:1283)
    at java.util.regex.Matcher.reset(Matcher.java:309)
    at java.util.regex.Matcher.<init>(Matcher.java:229)
    at java.util.regex.Pattern.matcher(Pattern.java:1093)
    at kamon.util.Filter$Glob.accept(Filter.scala:197)
    at kamon.util.Filter$IncludeExclude.kamon$util$Filter$IncludeExclude$$anonfun$2(Filter.scala:140)
    at kamon.util.Filter$IncludeExclude$lambda$includes$1.apply(Filter.scala:140)
    at kamon.util.Filter$IncludeExclude$lambda$includes$1.apply(Filter.scala:140)
    at scala.collection.LinearSeqOptimized$class.exists(LinearSeqOptimized.scala:93)
    at scala.collection.immutable.List.exists(List.scala:84)
    at kamon.util.Filter$IncludeExclude.includes(Filter.scala:140)
    at kamon.util.Filter$IncludeExclude.accept(Filter.scala:137)
    at kamon.instrumentation.akka.instrumentations.InstrumentNewExecutorServiceOnAkka25$.around(DispatcherInstrumentation.scala:155)
    at akka.dispatch.CachedThreadPoolExecutorServiceFactory.createExecutorService(CachedThreadPoolExecutorConfigurator.scala)
    at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.executor$lzycompute(Dispatcher.scala:43)
    at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.executor(Dispatcher.scala:43)
    at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:217)
    at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:42)
    at akka.dispatch.Dispatcher.executeTask(Dispatcher.scala:80)
    at akka.dispatch.MessageDispatcher.unbatchedExecute(AbstractDispatcher.scala:154)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:122)
    at akka.dispatch.MessageDispatcher.execute(AbstractDispatcher.scala:88)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
    at scala.concurrent.Promise$class.complete(Promise.scala:55)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
    at akka.http.impl.engine.client.PoolInterfaceActor$$anonfun$receive$1.applyOrElse(PoolInterfaceActor.scala:129)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:539)
    at akka.http.impl.engine.client.PoolInterfaceActor.akka$stream$actor$ActorSubscriber$$super$aroundReceive(PoolInterfaceActor.scala:68)
    at akka.stream.actor.ActorSubscriber$class.aroundReceive(ActorSubscriber.scala:191)
    at akka.http.impl.engine.client.PoolInterfaceActor.akka$stream$actor$ActorPublisher$$super$aroundReceive(PoolInterfaceActor.scala:68)
    at akka.stream.actor.ActorPublisher$class.aroundReceive(ActorPublisher.scala:350)
    at akka.http.impl.engine.client.PoolInterfaceActor.aroundReceive(PoolInterfaceActor.scala:68)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:612)
    at akka.actor.ActorCell.invoke(ActorCell.scala:581)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:268)
    at akka.dispatch.Mailbox.run(Mailbox.scala:229)
    at kamon.instrumentation.executor.ExecutorInstrumentation$InstrumentedForkJoinPool$TimingRunnable.run(ExecutorInstrumentation.scala:653)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:49)
    at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Joe Martinez
@JoePercipientAI
@mladens Here is the Kanela section of the logged config:
  "kanela": {
    "circuit-breaker": {
      "enabled": false,
      "free-memory-threshold": 20,
      "gc-process-cpu-threshold": 10
    },
    "class-dumper": {
      "create-jar": true,
      "dir": "\/home\/joe\/kanela-agent\/dump",
      "enabled": false,
      "jar-name": "instrumented-classes"
    },
    "class-replacer": {
      "replace": [
        "kamon.status.Status$Instrumentation$=>kanela.agent.util.KanelaInformationProvider"
      ]
    },
    "debug-mode": true,
    "gc-listener": {
      "log-after-gc-run": false
    },
    "instrumentation-registry": {
      "enabled": true
    },
    "log-level": "DEBUG",
    "modules": {
      "executor-service": {
        "within": [
          "^slick.*"
        ]
      },
      "jdbc": {
        "description": "Provides instrumentation for JDBC statements, Slick AsyncExecutor and the Hikari connection pool",
        "instrumentations": [
          "kamon.instrumentation.jdbc.StatementInstrumentation",
          "kamon.instrumentation.jdbc.HikariInstrumentation"
        ],
        "name": "JDBC Instrumentation",
        "within": [
          "^org.h2..*",
          "^org.sqlite..*",
          "^oracle.jdbc..*",
          "^com.amazon.redshift.jdbc42..*",
          "^com.amazon.redshift.core.jdbc42..*",
          "^com.mysql.jdbc..*",
          "^com.mysql.cj.jdbc..*",
          "^org.h2.Driver",
          "^org.h2.jdbc..*",
          "^net.sf.log4jdbc..*",
          "^org.mariadb.jdbc..*",
          "^org.postgresql.jdbc..*",
          "^com.microsoft.sqlserver.jdbc..*",
          "^com.zaxxer.hikari.pool.PoolBase",
          "^com.zaxxer.hikari.pool.PoolEntry",
          "^com.zaxxer.hikari.pool.HikariPool",
          "^com.zaxxer.hikari.pool.ProxyConnection",
          "^com.zaxxer.hikari.pool.HikariProxyStatement",
          "^com.zaxxer.hikari.pool.HikariProxyPreparedStatement",
          "^com.zaxxer.hikari.pool.HikariProxyCallableStatement"
        ]
      }
    },
    "show-banner": true
  }
Mladen Subotić
@mladens
Are you seeing any logfiles being generated by kanela? There should be some if debug mode is enabled.
Joe Martinez
@JoePercipientAI
I'm just looking at the console. Where would I find log files?
Mladen Subotić
@mladens
probably jvm work dir
Joe Martinez
@JoePercipientAI
@mladens Sorry, do you know how I can find out where that dir is on my system? Or what the log files would be named?
I'm on Linux
Ivan Topolnjak
@ivantopo
@gilshoshan17_twitter it looks like you have a $ symbol in your filters! If it is there by mistake, please remove it; if you actually wanted a regex in the filter, make sure the pattern starts with regex: followed by the expression
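A hedged example of what Ivan describes (the config key below is purely illustrative, not a real Kamon setting; the point is that filter patterns are globs unless prefixed with regex:):

```
# illustrative filter section — glob patterns by default,
# prefix a pattern with "regex:" to use a regular expression
some.kamon.filter {
  includes = [ "my-pool-*", "regex:worker-[0-9]+" ]
  excludes = [ ]
}
```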
Mladen Subotić
@mladens
@JoePercipientAI i just ran an app with kanela in debug and it produced kanela-agent.2019-09-12 17-37-19.log in the project root dir
Joe Martinez
@JoePercipientAI
@mladens Thanks. I just verified that if I run my non-Spark test app with that configuration, I DO get the log file in the project root dir. But not for my Spark app
Enrico Benini
@ebenini-mdsol

Hi there,
I'm attaching a simple project we crafted at work that shows how we get different output depending on the appender type (FileAppender vs. AsyncAppender) in the Logback configuration.

If you take the project and do an sbt run, changing the value of appender-ref (line 25) once to FILE and then to ASYNCFILE, you can see that log/app.log in one case just stays empty, while in the other it prints the span operationName.

We guess it could be due to some bug in kamon-logback or the kanela-agent.
Could you please have a look at the code and let us know?
Thank you so much

ps: I will also raise an issue on github for traceability ;)
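For readers without the attached project, a minimal sketch of the two setups being compared (standard Logback XML; the appender names FILE/ASYNCFILE and the log/app.log path are taken from the message above, the pattern is illustrative):

```xml
<!-- logback.xml sketch: swap the root appender-ref between FILE and ASYNCFILE -->
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>log/app.log</file>
    <encoder>
      <!-- %X prints all MDC values, e.g. a span operation name if propagated -->
      <pattern>%d %-5level %X - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="ASYNCFILE" class="ch.qos.logback.classic.AsyncAppender">
    <appender-ref ref="FILE"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="FILE"/> <!-- change to ASYNCFILE to reproduce -->
  </root>
</configuration>
```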
gmim
@gilshoshan17_twitter
@ivantopo hi Ivan. I don't have any $ sign in my filters. If my custom thread pool is not one of the common thread pool configurators, the Name field will be null (it will never be set), then the InstrumentNewExecutorServiceOnAkka.around method will get a null name, and from there the road to the NullPointerException is short.
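The mechanism described here can be reproduced in isolation: java.util.regex fails exactly as in the stack trace above when a compiled pattern is handed a null input (a minimal sketch):

```scala
import java.util.regex.Pattern

// Passing a null name to a compiled pattern reproduces the failure mode:
// Matcher's constructor calls reset(), which calls getTextLength() on the
// null input — the same frames that appear at the top of the stack trace.
val glob = Pattern.compile(".*")
glob.matcher(null) // throws NullPointerException in Matcher.getTextLength
```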
Thomas Ploch
@tPl0ch
Could it be that the Kamon Datadog reporter is not sending trace and span IDs as strings? In my logs I have hex number strings in dd.trace_id and dd.span_id, but the traces use large integers, so no correlation can be made with erroneous traces.
Thomas Ploch
@tPl0ch
I mean logs and traces cannot be correlated.
Thomas Ploch
@tPl0ch
Yup, so kamon-datadog converts the hex string to a BigInt while kamon-logback does not. Shouldn't these be aligned? Meaning, the string representation of a trace or span ID should be consistent across all Kamon instrumentations.
Thomas Ploch
@tPl0ch
I am thinking of adding a configuration option to kamon-datadog to not convert to BigDecimal but just use the hex number strings, in order to align log correlation. @ivantopo any thoughts on this?
Hmm, OK, so the trace endpoint in Datadog uses 64-bit unsigned integers for IDs according to the spec, so adding the conversion to kamon-logback is probably the way to go. Is there a possibility to convert values in logback directly?
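The conversion being discussed is just an unsigned 64-bit reinterpretation of the 16-character hex ID; a sketch of both directions (the hexId value is a made-up example):

```scala
// Hypothetical helper: turn a 16-char hex trace/span ID (as kamon-logback
// renders it) into the unsigned 64-bit decimal string Datadog expects.
val hexId  = "a3ce929d0e0e4736" // example value, not a real trace ID
val asLong = java.lang.Long.parseUnsignedLong(hexId, 16)

val decimalForm = java.lang.Long.toUnsignedString(asLong) // for Datadog
val backToHex   = java.lang.Long.toHexString(asLong)      // back to the log form
```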
Sakis Karagiannis
@AlterEgo7

hello everyone, I am trying to use kamon-bundle along with a datadog exporter, on an akka-http app. However I’m getting the following error:

java.lang.NoClassDefFoundError: Could not initialize class scala.concurrent.Future$
    at akka.http.impl.util.StreamUtils$CaptureTerminationOp$.<init>(StreamUtils.scala:281)
    at akka.http.impl.util.StreamUtils$CaptureTerminationOp$.<clinit>(StreamUtils.scala)
    at akka.http.scaladsl.model.HttpEntity$.captureTermination(HttpEntity.scala:672)

Has anyone seen this before?

Thomas Ploch
@tPl0ch
Yes, me. Just recently. It happens when Kamon.init() is not the first expression that is evaluated when the application starts.
@AlterEgo7 in my case it was because our Main object was extending App, which already initializes Futures. You can just create a trait KamonInit and list it in the extends clause before App.
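A sketch of the workaround Thomas describes (the trait name KamonInit comes from his message; the key point is that trait constructors run before App's delayed initialization):

```scala
import kamon.Kamon

// Initializing Kamon in a trait constructor ensures it runs before anything
// from App, which already touches scala.concurrent.Future machinery.
trait KamonInit {
  Kamon.init()
}

object Main extends KamonInit with App {
  // application code; Kamon is initialized by the time this body runs
}
```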
Sakis Karagiannis
@AlterEgo7
@tPl0ch thanks a lot, I see it in the documentation now as well. It might make sense for that exact phrase to be in bold in the site documentation, as the resulting error is completely unrelated and confusing
crow-fff
@crow-fff

It happens when Kamon.init() is not the first expression that is evaluated when the application starts.

And because of that, Kamon.init(customConfig) doesn't work either.

Ivan Topolnjak
@ivantopo
@tPl0ch as far as I understand, the trace IDs are converted to numbers because of the format expected by the DD agent