Jakub Kozłowski
@kubukoz
ah, so it's an ambiguous implicit now?
Calvin Lee Fernandes
@calvinlfer
the implementation is working great
but I can't get the tests to work ;(
Jakub Kozłowski
@kubukoz
do you have a branch?
Calvin Lee Fernandes
@calvinlfer
I’ll add you
Jakub Kozłowski
@kubukoz
cool, I'll take a look
Calvin Lee Fernandes
@calvinlfer
Thank you 🙏😄
Jakub Kozłowski
@kubukoz
not today though...
also, it seems like some weird macro shit I'm not really that familiar with
Calvin Lee Fernandes
@calvinlfer
Ah dang
no worries, whenever you get a chance. That would be a huge ergonomic improvement though
Calvin Lee Fernandes
@calvinlfer
hey all, I have a use case where I have a case class and I want to transform it into itself with Validation. I noticed that's not possible; are there any plans to support it?
  import cats.data.ValidatedNec
  import cats.implicits._
  import io.scalaland.chimney.dsl._
  import io.scalaland.chimney.cats._

  Example("Hello", 1)
    .into[Example]
    .withFieldComputedF[VN, String, String](_.a, e => (e.a + "!").invalidNec[String])
    .transform
  // Valid(Example("Hello", 1))  <-- :(
Piotr Krzemiński
@krzemin
You need to use lifted transformers as in the example here: https://scalalandio.github.io/chimney/transformers/lifted-transformers.html
Essentially you need to replace into with intoF and follow the types
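Based on the linked docs, a minimal sketch of the lifted-transformer approach as of Chimney 0.6.x (the Example case class and the VN alias are hypothetical stand-ins for whatever is defined in the user's code):

```scala
import cats.data.ValidatedNec
import cats.syntax.validated._
import io.scalaland.chimney.dsl._
import io.scalaland.chimney.cats._ // TransformerFSupport instance for ValidatedNec

case class Example(a: String, b: Int)  // hypothetical, mirroring the snippet above
type VN[+A] = ValidatedNec[String, A]

val result: VN[Example] =
  Example("Hello", 1)
    .intoF[VN, Example]  // intoF instead of into enables the lifted transformer
    .withFieldComputedF(_.a, e => (e.a + "!").validNec[String])
    .transform           // now returns VN[Example] instead of Example
// result should be Valid(Example("Hello!", 1))
```

With plain `into`, the F-suffixed customizations are not lifted, which is why the original snippet silently produced a bare `Valid(...)` of the unmodified value.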
Calvin Lee Fernandes
@calvinlfer
ah okay, thanks @krzemin :D
Calvin Lee Fernandes
@calvinlfer
any idea when a new release will be cut that includes the Ior changes?
Yuval Perry
@yperry
Hi,
Does Chimney support bidirectional or reverse mapping?
Piotr Krzemiński
@krzemin
@yperry nope, not yet, although we have a ticket for it
@calvinlfer I hope I will be able to trigger a release this week
Calvin Lee Fernandes
@calvinlfer
oh great, thanks @krzemin :D
Jakub Kozłowski
@kubukoz
Hey @krzemin, have you considered supporting scala-newtype in chimney?
Piotr Krzemiński
@krzemin
Hey @kubukoz, nope - I did not. I'm not using scala-newtype and don't have use cases for chimney integration. But feel free to create a ticket - I can eventually provide support for someone that wants to deliver a PR.
Wojtek Pituła
@Krever
what's the status of chimney on dotty? Anyone tried or still to be investigated?
Piotr Krzemiński
@krzemin
Well, Dotty changes a lot for Chimney. As the library is mostly macro-based and Dotty comes with brand new metaprogramming APIs, I think that ~90% of the code would need to be rewritten. It would be best if we could provide the same user API and behavioral compatibility (at least for the most common use cases) to make the Scala 2 -> Scala 3 transition easy for Chimney users, but depending on some subtle details that might be either a feasible way to go or extremely unlikely.
Mateusz Kubuszok
@MateuszKubuszok
I think we were discussing this internally some time ago, and we basically decided to wait a bit to see if Dotty's metaprogramming API is stable
Dotty seems to be stable, but once in a while some breaking change is introduced, and implementing against a moving target would be very demotivating
Piotr Krzemiński
@krzemin
Exactly. There is also a topic of dropping support for Scala 2.11 and Scalajs 0.6.x, which I would prefer to happen before we start supporting another language version.
Mikhail Sokolov
@migesok

Hi!
Probably a stupid question. I tried to write a generic implicit transformer for cats NonEmptyVector and so far derivation doesn't pick it up for me.
The transformer looks like this:

  implicit def nevToVectorTransformer[T, S](implicit
    elemTransformer: Transformer[T, S]
  ): Transformer[NonEmptyVector[T], Vector[S]] = (src: NonEmptyVector[T]) =>
    src.toVector.map(elemTransformer.transform)

  implicit def vectorToNevTransformer[T, S](implicit
    elemTransformer: Transformer[T, S]
  ): Transformer[Vector[T], NonEmptyVector[S]] = (src: Vector[T]) =>
    // note: fromVectorUnsafe throws if src is empty
    NonEmptyVector.fromVectorUnsafe(src.map(elemTransformer.transform))

I've even pulled element transformers "explicitly" into implicit scope, still no luck. Is there some kind of limitation in the macro derivation related to this?

Mateusz Kubuszok
@MateuszKubuszok
Did you import cats implicit instances into scope?
Hmm, actually all implicits provided are for TransformerFs
Piotr Krzemiński
@krzemin
Ahh, that seems to be an issue that comes up again and again. When you define a rule like that, the implicit search for elemTransformer: Transformer[T, S] will not invoke the derivation macro. For now you need to have concrete instances available as implicit vals/defs in the implicit scope.
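A minimal sketch of the workaround Piotr describes (the FooV1/FooV2 element types are hypothetical stand-ins; `Transformer.define(...).buildTransformer` is the Chimney 0.6.x way to materialize a concrete instance explicitly):

```scala
import io.scalaland.chimney.Transformer

// Hypothetical element types standing in for the real ones.
case class FooV1(x: Int)
case class FooV2(x: Int)

// A concrete implicit val: plain implicit search can find this directly,
// so a generic rule like nevToVectorTransformer[FooV1, FooV2] resolves
// without needing the macro to fire inside the implicit search.
implicit val fooTransformer: Transformer[FooV1, FooV2] =
  Transformer.define[FooV1, FooV2].buildTransformer
```

The key point is that the generic rule's `implicit elemTransformer` parameter is satisfied by ordinary implicit resolution, which only sees explicitly defined instances, not ones the macro could have derived on demand.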
Mikhail Sokolov
@migesok

For now you need to have concrete instances as implicit vals/defs available in the implicit scope

So I need to have all required elemTransformer implicit instances defined explicitly in the scope, right?
Somehow it didn't work in my real code. At the same time, when I tried to come up with a minimal example, I couldn't make it fail. I will dig deeper, maybe the issue is somewhere else.
My real code has those non-empty vectors nested inside a sealed trait hierarchy, with at least 2 levels of nesting.

That trick with pulling element transformers into the scope worked on my real code as well. I just needed to recompile it properly.
Thanks, guys!
Piotr Krzemiński
@krzemin
I'm going to drop support for Scala 2.11 in the next Chimney version. Not because it's fundamentally impossible to keep, but because it tends to increase the maintenance burden over time. Does anyone have a strong argument against it?
Nathan Marin
@NathanMarin

Hello!
I've got a question about a use case I'm trying to solve, but I'm not sure it's doable. I'd like to merge two case classes into one, something like:

case class A(a: String)
case class B(b: String)
case class C(a: String, b: String)

val a = A("a")
val b = B("b")

// I'd like to merge a and b into an instance of C, something like this:
// val c = (a, b).transformInto[C]
// expected: C("a", "b")

I managed to make it work by making some fields optional in C and using patching, but it feels hacky and I'd like to keep the same signatures as above:

case class A(a: String)
case class B(b: String)
case class C(a: String, b: Option[String]) // b is now an Option

val a = A("a")
val b = B("b")

val c = a.into[C].enableOptionDefaultsToNone.transform.patchUsing(b)
// works: C("a", Some("b"))

Would someone know if what I want to achieve is doable? Thanks in advance :)

Piotr Krzemiński
@krzemin
That’s not yet supported
Marcin Szałomski
@baldram

Hello!
I have an issue using Chimney in a Scala 2.13.3 project (sbt 1.4.x).
Since scala-library is provided at runtime in that project, I exclude it from the package. Unfortunately, after adding Chimney, the autoScalaLibrary := false setting is ignored and the Scala standard library gets added, increasing the package size by 5 MB.
Is this something that might be considered a bug in the Chimney library?

For the time being, is there a better way to work around this issue than adding the Chimney dependency as below?
If not, is there any risk in doing the following?

libraryDependencies += "io.scalaland" %% "chimney" % "0.6.1" excludeAll(
    ExclusionRule(organization = "org.scala-lang"),
    ExclusionRule(organization = "org.scala-lang.modules")
  )
Marcin Szałomski
@baldram

I will add reproduction steps to above.

  1. Having a Scala sbt project with autoScalaLibrary := false, I build the package with sbt clean assembly.
  2. The scala-library is not included, as expected.
  3. Add the "io.scalaland" %% "chimney" dependency.
  4. The scala-library is included, ignoring the autoScalaLibrary setting.

Is there a better solution for this than ExclusionRule? Shall I file an issue for this on GitHub?

Noe Alejandro Perez Dominguez
@osocron

Hello!
I've been trying to abstract some repetition using a function that looks like this:

type BasePropertyTransformer[T, C <: TransformerCfg] = TransformerInto[T, BaseProperty, C, TransformerFlags.Default]

  def transformLocationFields[T, C <: TransformerCfg](latitude: Option[BigDecimal],
                                                      longitude: Option[BigDecimal],
                                                      incRepresentation: Option[String],
                                                      adm3Name: Option[String],
                                                      adm2Name: Option[String],
                                                      adm1Name: Option[String],
                                                      postalCodeValue: Option[String],
                                                      addressRepr: Option[String])
                                                     (tr: BasePropertyTransformer[T, C]): BaseProperty = {
    val point = createPoint(latitude, longitude)
    tr.withFieldConst(_.inc_representation, incRepresentation)
      .withFieldConst(_.adm_3_name, adm3Name)
      .withFieldConst(_.adm_2_name, adm2Name)
      .withFieldConst(_.adm_1_name, adm1Name)
      .withFieldConst(_.postal_code_value, postalCodeValue)
      .withFieldConst(_.address_repr, addressRepr)
      .withFieldConst(_.latitude, latitude)
      .withFieldConst(_.longitude, longitude)
      .withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
      .transform
  }

Which I then would like to use multiple times in other parts of the codebase:

val propertyTransformer =
      prop.into[BaseProperty]
        .withFieldConst(_.source_name, "fa")
        .withFieldRenamed(_.id, _.iqs_id)
        .withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
        .withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
        .withFieldConst(_.gross_floor_area, grossFloorArea)

transformLocationFields(latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)(propertyTransformer)

But I get an error like this

[error] BaseProperty.scala:113:8: Bad internal transformer config type shape!
[error]       .transform
[error]        ^

Do you know if this is possible or if I'm doing something wrong?

Piotr Krzemiński
@krzemin
Hi, are you sure you're using the latest released version (0.6.1)? There was a fix for a similar issue (scalalandio/chimney#194).
If that happens in 0.6.1, please submit a bug report on GitHub.
Noe Alejandro Perez Dominguez
@osocron

I was indeed not using the latest version. However, updating did not make the error go away. I tried letting the compiler infer the types and that seemed to do the trick :)
Now I have something like this:

def transformLocationFields[T](from: T,
                               latitude: Option[BigDecimal],
                               longitude: Option[BigDecimal],
                               incRepresentation: Option[String],
                               adm3Name: Option[String],
                               adm2Name: Option[String],
                               adm1Name: Option[String],
                               postalCodeValue: Option[String],
                               addressRepr: Option[String]) = {
    val point = createPoint(latitude, longitude)
    from.into[BaseProperty]
      .withFieldConst(_.inc_representation, incRepresentation)
      .withFieldConst(_.adm_3_name, adm3Name)
      .withFieldConst(_.adm_2_name, adm2Name)
      .withFieldConst(_.adm_1_name, adm1Name)
      .withFieldConst(_.postal_code_value, postalCodeValue)
      .withFieldConst(_.address_repr, addressRepr)
      .withFieldConst(_.latitude, latitude)
      .withFieldConst(_.longitude, longitude)
      .withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
  }

And then when using it I do this:

transformLocationFields(prop, latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)
      .withFieldConst(_.source_name, "fa")
      .withFieldRenamed(_.id, _.iqs_id)
      .withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
      .withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
      .withFieldConst(_.gross_floor_area, grossFloorArea)
      .transform
Marcin Szałomski
@baldram

after adding Chimney, the autoScalaLibrary := false setting is ignored

Hi, no response to the issue I reported yet. I think I see what might be the root cause.
The same happens if I add scala-collection-compat, and this dependency is used by Chimney.
However, I see interesting information from @krzemin : "I'm going to drop support for Scala 2.11 in the next Chimney version. [...] Does anyone have any strong argument against it?".
Excellent. From my side, one more reason to uphold this decision. Is it literally planned for the next 0.6.2 release?

Piotr Krzemiński
@krzemin
@baldram sorry for the lack of response, I've never played with autoScalaLibrary - if that happens to be a Chimney issue, I'm happy to merge a PR.
Regarding the 2.11 drop, it was already done in 0.6.0 (https://github.com/scalalandio/chimney/releases/tag/0.6.0), but that won't eliminate scala-collection-compat, as it's required for easy 2.12/2.13 interop.
Marcin Szałomski
@baldram
Hi Piotr! Thank you for the response. Ah, I see, 2.11 is dropped but 2.12 stays with us, ok. Then I will keep being careful while assembling, with the ExclusionRule or another way if I find one. Btw, I'm happy to see the discussion on the Scala 3 transition. One point (the 2.11 drop) from that discussion is done already :smirk: (https://gitter.im/scalalandio/chimney?at=5f1ec760b1409c060f887361)
Thank you for the clarification!
Andy Czerwonka
@andyczerwonka
@krzemin Let's say I have two sealed trait hierarchies A and B. When I want to go from A => B, I can use transformInto. Most of the concrete logic looks the same, but I want to inject logic into one of the concrete type transformations. How would I go about doing that? E.g. https://scastie.scala-lang.org/andyczerwonka/qkeYbqp8T4aWJWZPGOoEoA/6
Bendix Sältz
@saeltz
Hi guys, following up on the conversation from July about Scala 3 support. Do you expect the metaprogramming API to be stable enough now with M3 to start implementing against it? What are your plans? Thanks.