Kai
@neko-kai
slf4j is usually backed by Logback.
You may also write a custom LogSink using the Google Cloud Logging APIs
Valentin Willscher
@valenterry

Hey guys - what is the easiest way to change the output of the default IzLogger?
E.g. say instead of

[info] I 2020-08-21T11:00:23.984 (Main.scala:17)foo.Main.runApplication [21:ioapp-compute-0] Pizza is great

I would like to have

[info] I 2020-08-21T11:00 (Main.scala:17) […foo.Main.runApplication] Pizza is great

How do I do that?

Kai
@neko-kai
@valenterry The easiest is to create a new ConsoleSink with a custom rendering policy. You can make a new rendering policy by passing a log template as an argument to new StringRenderingPolicy
Anton Semenov
@a7emenov
Could someone please take a look at 7mind/izumi#1201 ? The fix is fairly simple and I'd be happy to provide it myself, just want to be sure I'm not missing anything.
Kai
@neko-kai
@a7emenov Fix merged. Thanks for reporting the bug and bringing attention to it!
Anton Semenov
@a7emenov
@neko-kai cool! Could you please provide an estimate of when to expect a release with this fix included?
Kai
@neko-kai
@a7emenov will try to get it out today
Kai
@neko-kai
@a7emenov Released 0.10.19, you can grab it once it gets on central https://dev.azure.com/7mind/izumi/_build/results?buildId=3089&view=results
Anton Semenov
@a7emenov
Works like a charm, thanks.
Anton Solovyev
@Rosteelton
Hi! Is there some way to start using BIO if I have a lot of legacy code with F[_]? For example:
trait OldOps[F[_]] {
  //Monad Error
  def old: F[Unit]
  //....many methods
}

trait New[F[+_, +_]] {
  def a: F[Throwable, Unit]
}

class NewImpl[F[+_, +_]: BIOMonadError](oldOps: OldOps[F]) extends New[F] {
  def a: F[Throwable, Unit] = oldOps.old ???
}
Kai
@neko-kai
@Rosteelton
You can use kind-projector's type lambda syntax * to turn F[+_, +_] into F[_] by partially applying the first argument of F (e.g. to Throwable):
class NewImpl[F[+_, +_]: BIOMonadError](
  oldOps: OldOps[F[Throwable, *]]
) extends New[F] {
  def a: F[Throwable, Unit] = oldOps.old
}
You'll need to add kind-projector to sbt to use this syntax, with the following line:
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.11.0" cross CrossVersion.full)
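For readers without kind-projector, the * syntax above is just sugar for an ordinary type lambda or type alias. Here is a minimal self-contained sketch of the same idea, with Either standing in for F[+_, +_] and the BIOMonadError bound dropped for brevity (KindDemo and its members are hypothetical names, not part of izumi):

```scala
object KindDemo {
  trait OldOps[F[_]] { def old: F[Unit] }
  trait New[F[+_, +_]] { def a: F[Throwable, Unit] }

  // kind-projector's OldOps[F[Throwable, *]] is equivalent to partially
  // applying the first type argument, e.g. via a plain type alias:
  type ThrowableOr[+A] = Either[Throwable, A]

  // Either stands in for F[+_, +_] here; real code would keep F abstract
  // and use kind-projector's * (or a type lambda) instead of this alias.
  class NewImpl(oldOps: OldOps[ThrowableOr]) extends New[Either] {
    def a: Either[Throwable, Unit] = oldOps.old
  }

  object EitherOps extends OldOps[ThrowableOr] {
    def old: Either[Throwable, Unit] = Right(())
  }

  val impl = new NewImpl(EitherOps)
}
```

Calling KindDemo.impl.a yields Right(()), showing the monomorphic OldOps instance wired into the bifunctor interface.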
Dermot Haughey
@hderms
is there a way to inject some JSON into any log that goes through logstage?
i'm trying to inject datadog ids
unfortunately i'm also on 0.6
do I have to extend the LogstageCirceRenderingPolicy
Kai
@neko-kai

@hderms Do you mean to inject some specific JSON into ALL the log messages? You can attach context parameters to a logger using IzLogger#apply/IzLogger#withCustomContext, and all the messages from that logger will include these parameters (https://izumi.7mind.io/latest/release/doc/logstage/index.html#log-algebras):

val newLogger = logger("id" -> 1)
newLogger.info(s"abc cba")
// id=1 abc cba

In JSON output, the context fields will be under the @context key (https://izumi.7mind.io/latest/release/doc/logstage/index.html#overview) and can be arbitrary JSON. You can customize the JSON for each type with LogstageCodec implicit instances, beyond the default output
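To illustrate the semantics described above with a hypothetical stand-in (this is not the real LogStage API): context pairs attached to a logger travel with it and surface under the @context key in JSON output:

```scala
object CtxDemo {
  // Hypothetical mini-logger: apply() attaches context pairs, and the JSON
  // renderer puts them under the "@context" key, mimicking LogStage's layout.
  final case class MiniLogger(ctx: Map[String, String]) {
    def apply(extra: (String, String)*): MiniLogger = MiniLogger(ctx ++ extra)
    def renderJson(message: String): String = {
      val context = ctx.map { case (k, v) => s""""$k":"$v"""" }.mkString(",")
      s"""{"message":"$message","@context":{$context}}"""
    }
  }

  val logger = MiniLogger(Map.empty)
  val withId = logger("id" -> "1")
  // withId.renderJson("abc cba") ==
  //   {"message":"abc cba","@context":{"id":"1"}}
}
```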

michealhill
@micheal-hill
Is it possible to materialise an Izumi Tag at runtime from a fqn of a class? Alternatively, is there some mechanism that I can use to serialise/deserialise the tag info?
Kai
@neko-kai

@micheal-hill You could try Java Serialization on a LightTypeTag value. Alternatively, every monomorphic tag is representable as 2 strings + 2 ints, but there's no mechanism yet to recover it from an existing ParsedLightTypeTag instance (sorry just hadn't thought about that ahead of time).
https://github.com/zio/izumi-reflect/blob/develop/izumi-reflect/izumi-reflect/src/main/scala/izumi/reflect/macrortti/LightTypeTag.scala#L249

You could make a PR for that to allow for efficient serialization/deserialization in a new library version.

michealhill
@micheal-hill
Thanks!
michealhill
@micheal-hill
It looks like I can get everything I need from LightTypeTag, so I’ll probably go with Java Serialization on that - I don’t care about the details of the serialized value, other than to be able to deserialize it later.
Kai
@neko-kai
@micheal-hill Please report if that succeeds. I think the non-case classes in LightTypeTag may be missing the Serializable inheritance, so it could fail to work with Java serialization
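A generic Java-serialization round-trip for the approach discussed above. The case class here is a hypothetical stand-in for LightTypeTag so the sketch stays self-contained; per the caveat about missing Serializable inheritance, writeObject would throw NotSerializableException on any non-Serializable internals:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object SerDemo {
  // Round-trip any Serializable value through Java serialization.
  def roundTrip[A <: Serializable](value: A): A = {
    val bytes = new ByteArrayOutputStream()
    val out   = new ObjectOutputStream(bytes)
    out.writeObject(value)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    try in.readObject().asInstanceOf[A]
    finally in.close()
  }

  // Hypothetical stand-in for a tag value; Scala case classes extend
  // Serializable, so this round-trips cleanly.
  final case class TagStandIn(ref: String, hash: Int)
}
```

For example, SerDemo.roundTrip(SerDemo.TagStandIn("List[Int]", 42)) returns an equal copy of the original value.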
Andreas Gabor
@an-tex
Hi! Are there any major changes in 1.0? GitHub just tells me "<Release notes pending>" ;) https://github.com/7mind/izumi/releases/tag/v1.0.0
Kai
@neko-kai

@an-tex
A lot of major changes and new features, most of them covered in the relevant documentation sections.

Full release notes are indeed still pending...

There is also a slide deck for 1.0, but it contains no technical documentation https://github.com/7mind/slides/blob/master/10-izumi-1.0-functional-scala-2020/izumi-1.0.pdf
Andreas Gabor
@an-tex
awesome thanks!
Vladimir
@vladimir-lu
Thanks for the awesome library :)
Vladimir
@vladimir-lu
I'm looking into logging performance and noticed that currently the logstage RenderingPolicy is outputting a String - were there any considerations in not having this be Array[Byte] or similar?
Kai
@neko-kai
@vladimir-lu Performance wasn't a significant consideration beyond just being comfortably faster than logback/slf4j. If you're looking into improving it further, that would be much appreciated though!
Edvin Lundberg
@Edvin-san

Hi! I'm upgrading to distage 1.0.0 and noticed some macro dependencies when extending RoleAppMain.LauncherBIO2[IO] and RoleDescriptor.
Undefined macro parameter product-group, add ``-Xmacro-settings:product-group=<value>`` into ``scalac`` options [error] object MyRole extends RoleDescriptor {
Undefined macro parameter product-group, add ``-Xmacro-settings:product-group=<value>`` into ``scalac`` options [error] sealed abstract class MainBase(activation: Activation) extends RoleAppMain.LauncherBIO2[IO] {

I get similar errors for other macros. Eventually I had to add the following options as well as the sbt-git plugin.

          s"-Xmacro-settings:product-name=${name.value}",
          s"-Xmacro-settings:product-version=${version.value}",
          s"-Xmacro-settings:product-group=${organization.value}",
          s"-Xmacro-settings:sbt-version=${sbtVersion.value}",
          s"-Xmacro-settings:git-repo-clean=${git.gitUncommittedChanges.value}",
          s"-Xmacro-settings:git-branch=${git.gitCurrentBranch.value}",
          s"-Xmacro-settings:git-head-commit=${git.gitHeadCommit.value.getOrElse("")}",

I'm using scala 2.13.3 and sbt 1.3.8.
Are these dependencies included by design or something that accidentally slipped in?

Paul S.
@pshirshov
Yeah, this is by design. We had to support GraalVM and there is no good way to use manifests under graal
So now we just embed all the metadata directly into the code with a macro
I believe it's a good tradeoff for GraalVM/SJS support
Edvin Lundberg
@Edvin-san
Alright, thanks!
Kai
@neko-kai
I've answered on your GitHub issue btw (7mind/izumi#1359) - please comment if you experience any more issues
Edvin Lundberg
@Edvin-san
Thanks @neko-kai
I think I got that working (settled for the addHas solution). My wiring test now succeeds!
My next major challenge is when I try to run a test. I created 7mind/izumi#1365 for this.
Any help is appreciated!
Edvin Lundberg
@Edvin-san

Another question, we have a RunLocal launcher that is supposed to override kafka/postgres to run in container.
It used to look something like this:

object RunLocal extends RoleAppMain.Default(
      launcher = new RoleAppLauncher.LauncherBIO[Eff] {
        override val pluginConfig = PluginConfig.cached(packagesEnabled = Seq("com.company.plugins"))
        override val requiredActivations = Activation(Repo -> Repo.Prod)
        override protected def appOverride: ModuleBase = new ModuleDef {
          include(KafkaContainerModule)
          include(PostgresContainerModule)
        }
      },
    ) {
  override val requiredRoles = Vector(RawRoleParams(MyRole.id))
}

Now I've updated it to:

object RunLocal extends RoleAppMain.LauncherBIO2[IO] {
  override def requiredRoles(argv: RoleAppMain.ArgV): Vector[RawRoleParams] = Vector(RawRoleParams(MyRole.id))
  override def pluginConfig: PluginConfig = PluginConfig.cached(packagesEnabled = Seq("com.company.plugins"))
  override protected def roleAppBootOverrides(argv: RoleAppMain.ArgV): Module =
    super.roleAppBootOverrides(argv) ++ new ModuleDef {
      make[Activation].named("default").fromValue(Activation(Repo -> Repo.Prod)) // Not really sure what this does
      modify[Module].named("default")(_ ++ new ModuleDef {
        include(KafkaContainerModule)
        include(PostgresContainerModule)
        /*
        PostgresContainerModule:
        make[PostgreSQLContainer].fromResource(ContainerResource(() => new PostgreSQLContainer()))
        make[PostgresConfig].from { container: PostgreSQLContainer =>
            PostgresConfig(container.jdbcUrl, container.username, container.password)
        }
        */
      })
    }
}

I still get an error when running this:
DIConfigReadException: Couldn't read configuration type at path="pg" as type ``package::PostgresConfig``...

It seems it tries to make the PostgresConfig from the "real" config where the values are not defined locally.
But my override should fix this? What am I doing wrong?

Kai
@neko-kai

@Edvin-san
Try to change it like so:

object RunLocal extends RoleAppMain.LauncherBIO2[IO] {
  override def requiredRoles(argv: RoleAppMain.ArgV): Vector[RawRoleParams] = Vector(RawRoleParams(MyRole.id))

  override def pluginConfig: PluginConfig = {
    PluginConfig
      .cached(packagesEnabled = Seq("com.company.plugins"))
      .overriddenBy(new ModuleDef {
        include(KafkaContainerModule)
        include(PostgresContainerModule)
      })
  }

  override protected def roleAppBootOverrides(argv: RoleAppMain.ArgV): Module =
    super.roleAppBootOverrides(argv) ++ new ModuleDef {
      make[Activation].named("default").fromValue(Activation(Repo -> Repo.Prod)) // Not really sure what this does
    }
}

Although I'm not entirely sure that would help. (You used (_ ++ _) instead of overriddenBy, which could potentially cause make[PostgresConfig] to be shadowed by make[PostgresConfig].tagged(Repo.Prod), if such a binding also exists)

Edvin Lundberg
@Edvin-san
OK, I'm getting an overloaded method compile error:
def overriddenBy(plugins: Seq[PluginBase]): PluginConfig = copy(overrides = overrides ++ plugins)
def overriddenBy(plugin: PluginBase): PluginConfig = copy(overrides = overrides ++ Seq(plugin))
I added:
object ContainerPlugin extends PluginDef {
  include(KafkaContainerModule)
  include(PostgresContainerModule)
}
...
override def pluginConfig: PluginConfig = {
    PluginConfig
      .cached(packagesEnabled = Seq("com.company.plugins"))
      .overriddenBy(ContainerPlugin)
}
Edvin Lundberg
@Edvin-san
Seems to work
Kai
@neko-kai
Ah, sorry, should've been new PluginDef { ... } instead of new ModuleDef { ... }
Kai
@neko-kai
@Edvin-san I've added a new chapter "Specificity and defaults" https://izumi.7mind.io/distage/basics.html#specificity-and-defaults that may explain why ++ didn't work above. (Well, iff I guessed correctly and there was a make[PostgresConfig].tagged(Repo.Prod) somewhere in plugins)
Daniel Esik
@danicheg
Hi there! Do I understand correctly that when using 'distage-framework-docker' with the reuse policy enabled, it's not possible to configure killing containers after the tests finish? And containers can only be killed by hand (if I plan to use a long-running SBT)?
Kai
@neko-kai
@danicheg Yeah. You may kill all distage-spawned containers with docker rm -f $(docker ps -q -a -f 'label=distage.type')
For a different strategy, you may also disable reuse and use memoization in testkit - then the dockers will be started once and killed after the tests finish. To enable memoization, add MyDocker.Container to TestConfig#memoizationRoots in testkit: https://izumi.7mind.io/latest/release/doc/distage/distage-testkit.html#resource-reuse-memoization (This is also shown in distage-example, where docker is memoized transitively as it is a dependency of repository impls: https://github.com/7mind/distage-example/blob/develop/src/test/scala/leaderboard/tests.scala#L22)
Edvin Lundberg
@Edvin-san

@neko-kai

I've added a new chapter "Specificity and defaults" https://izumi.7mind.io/distage/basics.html#specificity-and-defaults that may explain why ++ didn't work above. (Well, iff I guessed correctly and there was a make[PostgresConfig].tagged(Repo.Prod) somewhere in plugins)

Nice, I will take a look! However, I should have mentioned that I could not find any Repo.Prod tag on the PostgresConfig. It is normally wired like makeConfig[PostgresConfig]("pg") using distage-extension-config.

Kai
@neko-kai

@Edvin-san It could look like this also:

class X extends ModuleDef {
  tag(Repo.Prod)

  makeConfig[PostgresConfig]("pg")
}

A tag call in the body applies .tagged to everything in the module except includes (no matter whether it appears before or after the bindings).
Hmm, but if there isn't something like that, then I don't have any ideas on what went wrong...