Jakub Kozłowski
@kubukoz
also, it seems like some weird macro shit I'm not really that familiar with
Calvin Lee Fernandes
@calvinlfer
Ah dang
no worries, whenever you get a chance. That would be a huge ergonomic improvement though
Calvin Lee Fernandes
@calvinlfer
hey all, I have a use case where I have a case class but I want to transform it into itself with Validation, I noticed that's not possible, are there any plans to support it?
  import cats.data.ValidatedNec
  import cats.implicits._
  import io.scalaland.chimney.dsl._
  import io.scalaland.chimney.cats._

  case class Example(a: String, b: Int)
  type VN[+A] = ValidatedNec[String, A]

  Example("Hello", 1)
    .into[Example]
    .withFieldComputedF[VN, String, String](_.a, e => (e.a + "!").invalidNec[String])
    .transform
  // Valid(Example("Hello", 1))  <-- :(
Piotr Krzemiński
@krzemin
You need to use lifted transformers as in the example here: https://scalalandio.github.io/chimney/transformers/lifted-transformers.html
Essentially you need to replace into with intoF and follow the types
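A minimal sketch of what that looks like, assuming an `Example(a: String, b: Int)` case class matching the snippet above:

```scala
import cats.data.ValidatedNec
import cats.syntax.validated._
import io.scalaland.chimney.dsl._
import io.scalaland.chimney.cats._

case class Example(a: String, b: Int) // assumed shape, mirroring the snippet above
type VN[+A] = ValidatedNec[String, A]

// intoF instead of into: the whole transformation now runs in VN,
// so the Valid/Invalid produced for the computed field actually propagates
val result: VN[Example] =
  Example("Hello", 1)
    .intoF[VN, Example]
    .withFieldComputedF(_.a, e => (e.a + "!").validNec[String])
    .transform
// Valid(Example("Hello!", 1))
```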
Calvin Lee Fernandes
@calvinlfer
ah okay, thanks @krzemin :D
Calvin Lee Fernandes
@calvinlfer
any idea when a new release will be cut that includes the Ior changes?
Yuval Perry
@yperry
Hi,
Does chimney support bidirectional or reverse mapping?
Piotr Krzemiński
@krzemin
@yperry nope, not yet, although we have a ticket for it
@calvinlfer I hope I will be able to trigger a release this week
Calvin Lee Fernandes
@calvinlfer
oh great, thanks @krzemin :D
Jakub Kozłowski
@kubukoz
Hey @krzemin, have you considered supporting scala-newtype in chimney?
Piotr Krzemiński
@krzemin
Hey @kubukoz, nope - I did not. I'm not using scala-newtype and don't have use cases for chimney integration. But feel free to create a ticket - I can eventually provide support for someone that wants to deliver a PR.
Wojtek Pituła
@Krever
what's the status of chimney on dotty? Anyone tried or still to be investigated?
Piotr Krzemiński
@krzemin
Well, Dotty changes a lot for Chimney. Since the library is mostly macro-based and Dotty comes with brand new metaprogramming APIs, I think ~90% of the code would need to be rewritten. It would be best if we could provide the same user API and behavioral compatibility (at least for the most common use cases) to make the Scala 2 -> Scala 3 transition easy for Chimney users, but depending on some subtle details that might be either a feasible way to go or extremely unlikely.
Mateusz Kubuszok
@MateuszKubuszok
I think we were discussing this internally some time ago, and we basically decided to wait a bit to see if Dotty's metaprogramming API is stable
Dotty seems to be stable, but once in a while some breaking change is introduced, and implementing against a moving target would be very demotivating
Piotr Krzemiński
@krzemin
Exactly. There is also the topic of dropping support for Scala 2.11 and Scala.js 0.6.x, which I would prefer to happen before we start supporting another language version.
Mikhail Sokolov
@migesok

Hi!
Probably a stupid question. I tried to write a generic implicit transformer for cats NonEmptyVector and so far derivation doesn't pick it up for me.
The transformer looks like this:

  implicit def nevToVectorTransformer[T, S](implicit
    elemTransformer: Transformer[T, S],
  ): Transformer[NonEmptyVector[T], Vector[S]] = (src: NonEmptyVector[T]) =>
    src.toVector.map(elemTransformer.transform)

  implicit def vectorToNevTransformer[T, S](implicit
    elemTransformer: Transformer[T, S],
  ): Transformer[Vector[T], NonEmptyVector[S]] = (src: Vector[T]) =>
    NonEmptyVector.fromVectorUnsafe(src.map(elemTransformer.transform))

I've even pulled element transformers "explicitly" into implicit scope, still no luck. Is there some kind of limitation in the macro derivation related to this?

Mateusz Kubuszok
@MateuszKubuszok
Did you import cats implicit instances into scope?
Hmm, actually all implicits provided are for TransformerFs
Piotr Krzemiński
@krzemin
Ahh, that seems to be another issue that comes up again and again. When you define a rule like that, the implicit search for elemTransformer: Transformer[T, S] will not invoke the macro. For now you need to have concrete instances as implicit vals/defs available in the implicit scope.
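For example (with hypothetical Src/Dst element types standing in for the real ones), the generic NonEmptyVector rule starts working once a concrete element instance is in scope:

```scala
import cats.data.NonEmptyVector
import io.scalaland.chimney.Transformer
import io.scalaland.chimney.dsl._

// hypothetical element types standing in for the real ones
case class Src(x: Int)
case class Dst(x: Int)

// a concrete implicit val: plain implicit search can find this directly,
// without having to expand the derivation macro
implicit val srcToDst: Transformer[Src, Dst] =
  Transformer.define[Src, Dst].buildTransformer

implicit def nevToVectorTransformer[T, S](implicit
  elemTransformer: Transformer[T, S]
): Transformer[NonEmptyVector[T], Vector[S]] = (src: NonEmptyVector[T]) =>
  src.toVector.map(elemTransformer.transform)

val out: Vector[Dst] = NonEmptyVector.of(Src(1), Src(2)).transformInto[Vector[Dst]]
// Vector(Dst(1), Dst(2))
```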
Mikhail Sokolov
@migesok

For now you need to have concrete instances as implicit vals/defs available in the implicit scope

So I need to have all required elemTransformer implicit instances defined explicitly in the scope, right?
Somehow it didn't work in my real code. At the same time, when I tried to come up with a minimal example, I couldn't make it fail. I will dig deeper, maybe the issue is somewhere else.
My real code has those non-empty vectors nested inside a sealed trait hierarchy, with at least 2 levels of nesting.

That trick with pulling element transformers into the scope worked on my real code as well. I just needed to recompile it properly.
Thanks, guys!
Piotr Krzemiński
@krzemin
I'm going to drop support for Scala 2.11 in the next Chimney version. Not because it's fundamentally impossible to keep, but because it tends to increase the maintenance burden over time. Does anyone have a strong argument against it?
Nathan Marin
@NathanMarin

Hello!
I've got a question about a use case I'm trying to solve, but I'm not sure it's doable. I'd like to merge two case classes into one, something like:

case class A(a: String)
case class B(b: String)
case class C(a: String, b: String)

val a = A("a")
val b = B("b")

// I'd like to merge a and b into an instance of C, something like this:
// val c = (a, b).transformInto[C]
// expected: C("a", "b")

I managed to make it work by making some fields optional in C and using patching, but it feels hacky and I'd like to keep the same signatures as above:

case class A(a: String)
case class B(b: String)
case class C(a: String, b: Option[String]) // b is now an Option

val a = A("a")
val b = B("b")

val c = a.into[C].enableOptionDefaultsToNone.transform.patchUsing(b)
// works: C("a", Some("b"))

Would someone know if what I want to achieve is doable? Thanks in advance :)

Piotr Krzemiński
@krzemin
That’s not yet supported
1 reply
Marcin Szałomski
@baldram

Hello!
I have an issue while using Chimney in 2.13.3 Scala project (SBT 1.4.x).
As in the mentioned project scala-library is provided at runtime, I exclude it from the package. Unfortunately, after adding Chimney, the autoScalaLibrary := false setting is ignored and the Scala std lib gets added, increasing the package by 5MB.
Is it something that might be considered a bug in the Chimney library?

For the time being, is there a better workaround for this issue than adding the Chimney dependency like below?
If not, is there any risk in doing the following?

libraryDependencies += "io.scalaland" %% "chimney" % "0.6.1" excludeAll(
    ExclusionRule(organization = "org.scala-lang"),
    ExclusionRule(organization = "org.scala-lang.modules")
  )
Marcin Szałomski
@baldram

I will add reproduction steps to above.

  1. Having Scala SBT project with autoScalaLibrary := false, I build the package with sbt clean assembly.
  2. The scala-library is not included as expected.
  3. Add "io.scalaland" %% "chimney" dependency.
  4. The scala-library is included ignoring autoScalaLibrary setting.

Is there a better solution for it than ExclusionRule? Shall I file an issue for this on GitHub?

Noe Alejandro Perez Dominguez
@osocron

Hello!
I've been trying to abstract some repetition using a function that looks like this:

type BasePropertyTransformer[T, C <: TransformerCfg] = TransformerInto[T, BaseProperty, C, TransformerFlags.Default]

  def transformLocationFields[T, C <: TransformerCfg](latitude: Option[BigDecimal],
                                                      longitude: Option[BigDecimal],
                                                      incRepresentation: Option[String],
                                                      adm3Name: Option[String],
                                                      adm2Name: Option[String],
                                                      adm1Name: Option[String],
                                                      postalCodeValue: Option[String],
                                                      addressRepr: Option[String])
                                                     (tr: BasePropertyTransformer[T, C]): BaseProperty = {
    val point = createPoint(latitude, longitude)
    tr.withFieldConst(_.inc_representation, incRepresentation)
      .withFieldConst(_.adm_3_name, adm3Name)
      .withFieldConst(_.adm_2_name, adm2Name)
      .withFieldConst(_.adm_1_name, adm1Name)
      .withFieldConst(_.postal_code_value, postalCodeValue)
      .withFieldConst(_.address_repr, addressRepr)
      .withFieldConst(_.latitude, latitude)
      .withFieldConst(_.longitude, longitude)
      .withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
      .transform
  }

Which I then would like to use multiple times in other parts of the codebase:

val propertyTransformer =
      prop.into[BaseProperty]
        .withFieldConst(_.source_name, "fa")
        .withFieldRenamed(_.id, _.iqs_id)
        .withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
        .withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
        .withFieldConst(_.gross_floor_area, grossFloorArea)

transformLocationFields(latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)(propertyTransformer)

But I get an error like this

[error] BaseProperty.scala:113:8: Bad internal transformer config type shape!
[error]       .transform
[error]        ^

Do you know if this is possible or if I'm doing something wrong?

Piotr Krzemiński
@krzemin
Hi, are you sure you're using the latest released version (0.6.1)? There was a fix for a similar issue (scalalandio/chimney#194).
If that happens in 0.6.1, please submit a bug report on GitHub.
Noe Alejandro Perez Dominguez
@osocron

I was indeed not using the latest version. However, that did not make the error go away. I tried letting the compiler infer the types and that seemed to do the trick :)
Now I have something like this:

def transformLocationFields[T](from: T,
                               latitude: Option[BigDecimal],
                               longitude: Option[BigDecimal],
                               incRepresentation: Option[String],
                               adm3Name: Option[String],
                               adm2Name: Option[String],
                               adm1Name: Option[String],
                               postalCodeValue: Option[String],
                               addressRepr: Option[String]) = {
    val point = createPoint(latitude, longitude)
    from.into[BaseProperty]
      .withFieldConst(_.inc_representation, incRepresentation)
      .withFieldConst(_.adm_3_name, adm3Name)
      .withFieldConst(_.adm_2_name, adm2Name)
      .withFieldConst(_.adm_1_name, adm1Name)
      .withFieldConst(_.postal_code_value, postalCodeValue)
      .withFieldConst(_.address_repr, addressRepr)
      .withFieldConst(_.latitude, latitude)
      .withFieldConst(_.longitude, longitude)
      .withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
  }

And then when using it I do this:

transformLocationFields(prop, latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)
      .withFieldConst(_.source_name, "fa")
      .withFieldRenamed(_.id, _.iqs_id)
      .withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
      .withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
      .withFieldConst(_.gross_floor_area, grossFloorArea)
      .transform
Marcin Szałomski
@baldram

after adding Chimney, the autoScalaLibrary := false setting is ignored

Hi, there has been no response to the issue I reported. I think I see what might be the root cause.
The same happens if I add scala-collection-compat and this dependency is used by Chimney.
However, I see interesting information from @krzemin : "I'm going to drop support for Scala 2.11 in the next Chimney version. [...] Does anyone have any strong argument against it?".
Excellent. From my side, one more reason to uphold this decision. Is it literally planned for the next 0.6.2 release?

Piotr Krzemiński
@krzemin
@baldram sorry for the lack of response, I've never played with autoScalaLibrary - if that turns out to be a Chimney issue, I'm happy to merge a PR.
Regarding 2.11 drop, it was already done in 0.6.0 (https://github.com/scalalandio/chimney/releases/tag/0.6.0), but that won't eliminate scala-collection-compat as it's required for easy 2.12/2.13 interop.
Marcin Szałomski
@baldram
Hi Piotr! Thank you for the response. Ah, I see, 2.11 is dropped, but 2.12 stays with us, ok. Then I will keep being careful while assembling, with the ExclusionRule or another way if I find one. Btw. I'm happy to see the discussion on the Scala 3 transition. One point (the 2.11 drop) from that discussion is done already :smirk: (https://gitter.im/scalandio/chimney?at=5f1ec760b1409c060f887361)
Thank you for clarification!
Andy Czerwonka
@andyczerwonka
@krzemin Let's say I have two sealed trait hierarchies A and B. When I want to go from A => B, I can use transformInto. Most of the concrete logic looks the same, but I want to inject logic into one of the concrete type transformations. How would I go about doing that? E.g. https://scastie.scala-lang.org/andyczerwonka/qkeYbqp8T4aWJWZPGOoEoA/6
24 replies
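One way to do that in current Chimney (a sketch with stand-in hierarchies, not the actual Scastie code) is to let derivation handle the matching cases and override a single one with withCoproductInstance:

```scala
import io.scalaland.chimney.dsl._

// stand-in hierarchies; the real ones are in the Scastie link
sealed trait A
object A { case object Red extends A; case object Blue extends A }

sealed trait B
object B { case object Red extends B; case object Blue extends B; case object Custom extends B }

// Red -> Red is derived automatically by name; only Blue gets custom logic
val b: B = (A.Blue: A)
  .into[B]
  .withCoproductInstance[A.Blue.type](_ => B.Custom)
  .transform
// B.Custom
```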
Bendix Sältz
@saeltz
Moin guys, following up on the conversation from July on Scala 3 support. Do you expect the metaprogramming API to be stable enough now, with M3, to start implementing against it? What are your plans? Thanks.
Jonathan Ostrander
@jonathan-ostrander
Hey all, I just started diving into using Chimney for transforming ScalaPB models. Using Transformer.define[ProtobufModel, Model].enableUnsafeOption.buildTransformer works fine, but I would like to use TransformerF to handle transformation errors in a nicer way (it's also recommended by the docs). However, I'm failing to see how to get a TransformerF[EitherStringVec, FooProto, Foo] for something like FooProto(bar: Option[String], baz: Option[Int]) and Foo(bar: String, baz: Int) without explicitly checking for None on each field.
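One approach worth trying (a sketch under the assumption that lifted derivation will pick up a generic Option-unwrapping instance; the error message and field names are made up) is a single rule that turns every missing Option into an error:

```scala
import io.scalaland.chimney.TransformerF
import io.scalaland.chimney.dsl._

type EitherStringVec[+A] = Either[Vector[String], A]

// one generic rule instead of a None check per field
implicit def unwrapOption[A]: TransformerF[EitherStringVec, Option[A], A] = {
  case Some(a) => Right(a)
  case None    => Left(Vector("required field was empty"))
}

case class FooProto(bar: Option[String], baz: Option[Int])
case class Foo(bar: String, baz: Int)

val ok = FooProto(Some("x"), Some(1)).intoF[EitherStringVec, Foo].transform
// Right(Foo("x", 1))
val bad = FooProto(None, Some(1)).intoF[EitherStringVec, Foo].transform
// a Left describing the missing field(s)
```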
Barrie McGuire
@barriem
Hi @krzemin - sorry if this one has come up a lot before. We were wondering about the reasons behind the covariance in TransformerFSupport, and whether there are any plans to relax it? We want to use the .intoF functionality in a quite deeply nested layer with an invariant higher-kinded type F[_], which we use heavily with cats / cats-effect type classes. Changing this to be covariant everywhere would be a bit difficult. Thanks very much!
Yisrael Union
@yisraelU

is there a way to specify a prefix for a field and achieve automatic transformation? i.e.

case class X(i:Int)
case class Y(pre_i:Int)
X(1).transformInto[Y]

would like to be able to specify somewhere that "pre_" is a prefix
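As far as I know there is no prefix option in 0.6.x, only per-field renames; for the example above that would look like:

```scala
import io.scalaland.chimney.dsl._

case class X(i: Int)
case class Y(pre_i: Int)

// no prefix rule exists, but an explicit rename per field works today
val y: Y = X(1).into[Y].withFieldRenamed(_.i, _.pre_i).transform
// Y(1)
```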

Roger
@rogern
Hi, I wonder if scalapb.GeneratedEnum is supported and, in that case, how to configure chimney to handle it? I'm not sure, but maybe it doesn't work because ScalaPB by default makes enums a sealed abstract class rather than a sealed trait? Anyway, with a couple of custom transformers it works out, but it would be nice to handle it in a more general way. Thanks!
1 reply
Denis Mikhaylov
@notxcain
Hi! Did I understand right that conversion between two Java Enums is not supported yet?
5 replies
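Until it is, you can hand-roll an instance. A name-based sketch (the generic helper below is my own, not part of Chimney), demonstrated with two JDK enums that happen to share a constant name:

```scala
import java.time.temporal.ChronoUnit
import java.util.concurrent.TimeUnit
import scala.reflect.ClassTag
import io.scalaland.chimney.Transformer
import io.scalaland.chimney.dsl._

// bridges any Java enum to another by constant name;
// Enum.valueOf throws IllegalArgumentException if the name is missing in the target
implicit def javaEnumTransformer[A <: Enum[A], B <: Enum[B]](
  implicit ct: ClassTag[B]
): Transformer[A, B] =
  (a: A) => Enum.valueOf(ct.runtimeClass.asInstanceOf[Class[B]], a.name())

val tu: TimeUnit = ChronoUnit.SECONDS.transformInto[TimeUnit]
// TimeUnit.SECONDS
```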
Dmytro Kostiuchenko
@edio

My question is not strictly related to chimney as is. As I mentioned in a message above, we maintain our own chimney fork. The point is to have support for Java enums and Scala enumerations among other things.

It was quite easy to add support for Java enums: because every enum value is a subtype of the enum declaration, we just needed a few changes in the existing code that supports sealed hierarchies.

With Scala enumerations it is quite different: every enumeration value has the same type. And I'm trying to reuse the same code that already supports sealed hierarchies (to get .withCoproductInstance almost for free).

I used a trick with ConstantType, where an arbitrary value (?) can be expressed as a unique type:

val c: blackbox.Context
val s: MethodSymbol = ???

c.internal.constantType(Constant(s)) // this gives me a UniqueConstantType instance

However, my code fails to compile now with an error:

type arguments [<notype>(SkyBlue),richcolors.ShadesOfGrey.Color,io.scalaland.chimney.internal.TransformerCfg.CoproductInstance[<notype>(SeafoamGreen),richcolors.ShadesOfGrey.Color,io.scalaland.chimney.internal.TransformerCfg.CoproductInstance[<notype>(SalmonRed),richcolors.ShadesOfGrey.Color,io.scalaland.chimney.internal.TransformerCfg.Empty]]] do not conform to class CoproductInstance's type parameter bounds [InstType,TargetType,C <: io.scalaland.chimney.internal.TransformerCfg]
[error]                 .withCoproductInstance(CamelCase.SkyBlue, ShadesOfGrey.Grey)
...
[info] <notype>(SkyBlue) <: Any?
[info] false
...

where <notype> seems to be a (quite weird) String representation of the newly constructed type, and SkyBlue is my symbol that represents the enumeration value.

How is it possible that something is not a subtype of Any?

Or is the message a red herring?

Would highly appreciate any help and suggestions!

2 replies
Dmytro Kostiuchenko
@edio
@krzemin, is there a way to make some tests in chimney only be executed when building for the JVM? I'm working on that Java enums PR and, naturally, Java enums do not play nicely with Scala.js:
[info] Fast optimizing /home/dmytroko/develop/chimney-oss/chimney/.js/target/scala-2.13/chimney-test-fastopt
[error] Referring to non-existent class io.scalaland.chimney.examples.JavaEnums$Colors6
4 replies
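For what it's worth, the `.js` path in that log suggests the build uses sbt-crossproject with CrossType.Pure, in which case platform-specific sources live under the dotted platform directories (a sketch of the layout, assuming I am reading the build right):

```scala
// Source layout for crossProject(JVMPlatform, JSPlatform).crossType(CrossType.Pure):
//   chimney/src/test/scala       -> compiled for all platforms
//   chimney/.jvm/src/test/scala  -> JVM only: Java-enum tests could live here
//   chimney/.js/src/test/scala   -> Scala.js only
```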
wookievx
@wookievx

Hello everyone, I answered the question regarding the Scala 3.0 migration:
scalalandio/chimney#201
I have a working POC of product support (there is a long way to go to deliver all of Chimney's functionality; in particular, handling methods and Java beans will require diving deep into Tasty reflection, as will emitting descriptive error messages when derivation fails), but some of the functionality is there:

  • support for products (coproducts should be relatively easy to add)
  • overriding values (computed and constant; adding renaming should be straightforward)
  • using default values from target type

I am planning to extend current code to handle recursive derivation.
As I described in the issue, I am trying to avoid Tasty reflection where possible and so far I have been successful. Unfortunately, this requires some tedious type-parameter passing (which could be circumvented to some degree and might be something to do early to keep the derivation code comprehensible). Here is the current implementation:
https://github.com/wookievx/domain-slices/tree/chimney-like-poc

2 replies
Karel Fajkus
@Peppi-Ressler

Hi guys, I've encountered this issue https://github.com/scalalandio/chimney/blob/master/chimney/src/main/scala/io/scalaland/chimney/internal/utils/MacroUtils.scala#L303 with chimney 0.6.1 and Scala 2.13.3. Currently I'm not even sure what exactly caused it, but when I made the "sub-transformations" explicit via withFieldComputed it works without issue. I would need a little help to even identify what is wrong on my side so I can create a ticket with a reproducible error.

final case class A(input: Something) extends AnyVal with ApiClass[Something]

object A extends ApiClassOps[Something, A]

trait ApiClass[A] extends Any {
  def input: A
}

trait ApiClassOps[A, B <: ApiClass[A]] {
  implicit def apiClassTransformer[To](implicit transformer: Transformer[A, To]): Transformer[B, To] = from => transformer.transform(from.input)
}

I think the main issue will be somewhere in that code, as before this I had no issues with chimney at all. Thanks for any help!