import cats.data.ValidatedNec
import cats.implicits._
import io.scalaland.chimney.dsl._
import io.scalaland.chimney.cats._
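// assuming, to make this snippet self-contained, definitions along the lines of:
// case class Example(a: String, b: Int)
// type VN[+A] = ValidatedNec[String, A]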
Example("Hello", 1)
.into[Example]
.withFieldComputedF[VN, String, String](_.a, e => (e.a + "!").invalidNec[String])
.transform
// Valid(Example("Hello", 1)) <-- :(
Hi!
Probably a stupid question. I tried to write generic implicit transformers for cats' NonEmptyVector,
and so far derivation doesn't pick them up for me.
The transformers look like this:
implicit def nevToVectorTransformer[T, S](implicit
elemTransformer: Transformer[T, S],
): Transformer[NonEmptyVector[T], Vector[S]] = (src: NonEmptyVector[T]) =>
src.toVector.map(elemTransformer.transform)
implicit def vectorToNevTransformer[T, S](implicit
elemTransformer: Transformer[T, S],
): Transformer[Vector[T], NonEmptyVector[S]] = (src: Vector[T]) =>
NonEmptyVector.fromVectorUnsafe(src.map(elemTransformer.transform))
I've even pulled element transformers "explicitly" into implicit scope, still no luck. Is there some kind of limitation in the macro derivation related to this?
For now you need to have concrete instances as implicit vals/defs available in the implicit scope
So I need to have all required elemTransformer
implicit instances defined explicitly in the scope, right?
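For illustration, "concrete instances in scope" could look something like this (Foo and Bar are made-up element types, just to sketch the idea):
import cats.data.NonEmptyVector
import io.scalaland.chimney.Transformer

case class Foo(x: Int)
case class Bar(x: Int)

// concrete element transformer
implicit val fooToBar: Transformer[Foo, Bar] = (f: Foo) => Bar(f.x)

// concrete collection-level instance, so the derivation macro doesn't have to
// resolve the generic nevToVectorTransformer by itself
implicit val foosToBars: Transformer[NonEmptyVector[Foo], Vector[Bar]] =
  nevToVectorTransformer[Foo, Bar]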
Somehow it didn't work in my real code. At the same time, when I tried to come up with a minimal example, I couldn't make it fail. I will dig deeper, maybe the issue is somewhere else.
My real code has those non-empty vectors nested inside a sealed trait hierarchy, with at least 2 levels of nesting.
Hello!
I've got a question about a use case I'm trying to solve, but I'm not sure it's doable. I'd like to merge two case classes
into one, something like:
case class A(a: String)
case class B(b: String)
case class C(a: String, b: String)
val a = A("a")
val b = B("b")
// I'd like to merge a and b into an instance of C, something like this:
// val c = (a, b).transformInto[C]
// expected: C("a", "b")
I managed to make it work by making some fields optional in C
and using a patcher (patchUsing),
but it feels hacky and I'd like to keep the same signatures as above:
case class A(a: String)
case class B(b: String)
case class C(a: String, b: Option[String]) // b is now an Option
val a = A("a")
val b = B("b")
val c = a.into[C].enableOptionDefaultsToNone.transform.patchUsing(b)
// works: C("a", Some("b"))
Would someone know if what I want to achieve is doable? Thanks in advance :)
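For what it's worth, a rough workaround that avoids the Option (not the tuple syntax asked for above, and untested): start from a and pull b's field in with withFieldConst.
import io.scalaland.chimney.dsl._

case class A(a: String)
case class B(b: String)
case class C(a: String, b: String)

val a = A("a")
val b = B("b")

// keeps C's fields non-optional; the price is wiring b's fields by hand
val c = a.into[C].withFieldConst(_.b, b.b).transform
// C("a", "b")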
Hello!
I have an issue while using Chimney in a Scala 2.13.3 project (sbt 1.4.x).
Since in the mentioned project scala-library
is provided at runtime, I exclude it from the package. Unfortunately, after adding Chimney, the autoScalaLibrary := false
setting is ignored and the Scala std lib gets added, increasing the package by 5MB.
Is this something that might be considered a bug in the Chimney library?
For the time being, is there a better way to work around this issue than adding the Chimney dependency like below?
If not, is there any risk in doing the following?
libraryDependencies += "io.scalaland" %% "chimney" % "0.6.1" excludeAll(
ExclusionRule(organization = "org.scala-lang"),
ExclusionRule(organization = "org.scala-lang.modules")
)
I will add reproduction steps to the above:
1. With autoScalaLibrary := false set, I build the package with sbt clean assembly. scala-library is not included, as expected.
2. I add the "io.scalaland" %% "chimney" dependency. Now scala-library is included, ignoring the autoScalaLibrary setting.
Is there a better solution for it than ExclusionRule? Shall I file an issue for this on GitHub?
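For context, a minimal build.sbt along the lines of the setup described above (versions and layout assumed):
// build.sbt - rough sketch of the setup that reproduces the issue
ThisBuild / scalaVersion := "2.13.3"

lazy val root = (project in file("."))
  .settings(
    // scala-library is provided at runtime, so keep it out of the package
    autoScalaLibrary := false,
    // after adding this, scala-library shows up in the package again
    libraryDependencies += "io.scalaland" %% "chimney" % "0.6.1"
  )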
Hello!
I've been trying to abstract some repetition using a function that looks like this:
type BasePropertyTransformer[T, C <: TransformerCfg] = TransformerInto[T, BaseProperty, C, TransformerFlags.Default]
def transformLocationFields[T, C <: TransformerCfg](latitude: Option[BigDecimal],
longitude: Option[BigDecimal],
incRepresentation: Option[String],
adm3Name: Option[String],
adm2Name: Option[String],
adm1Name: Option[String],
postalCodeValue: Option[String],
addressRepr: Option[String])
(tr: BasePropertyTransformer[T, C]): BaseProperty = {
val point = createPoint(latitude, longitude)
tr.withFieldConst(_.inc_representation, incRepresentation)
.withFieldConst(_.adm_3_name, adm3Name)
.withFieldConst(_.adm_2_name, adm2Name)
.withFieldConst(_.adm_1_name, adm1Name)
.withFieldConst(_.postal_code_value, postalCodeValue)
.withFieldConst(_.address_repr, addressRepr)
.withFieldConst(_.latitude, latitude)
.withFieldConst(_.longitude, longitude)
.withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
.transform
}
Which I then would like to use multiple times in other parts of the codebase:
val propertyTransformer =
prop.into[BaseProperty]
.withFieldConst(_.source_name, "fa")
.withFieldRenamed(_.id, _.iqs_id)
.withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
.withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
.withFieldConst(_.gross_floor_area, grossFloorArea)
transformLocationFields(latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)(propertyTransformer)
But I get an error like this
[error] BaseProperty.scala:113:8: Bad internal transformer config type shape!
[error] .transform
[error] ^
Do you know if this is possible or if I'm doing something wrong?
I was indeed not using the latest version. However, that did not make the error go away. I tried letting the compiler infer the types and that seemed to do the trick :)
Now I have something like this:
def transformLocationFields[T](from: T,
latitude: Option[BigDecimal],
longitude: Option[BigDecimal],
incRepresentation: Option[String],
adm3Name: Option[String],
adm2Name: Option[String],
adm1Name: Option[String],
postalCodeValue: Option[String],
addressRepr: Option[String]) = {
val point = createPoint(latitude, longitude)
from.into[BaseProperty]
.withFieldConst(_.inc_representation, incRepresentation)
.withFieldConst(_.adm_3_name, adm3Name)
.withFieldConst(_.adm_2_name, adm2Name)
.withFieldConst(_.adm_1_name, adm1Name)
.withFieldConst(_.postal_code_value, postalCodeValue)
.withFieldConst(_.address_repr, addressRepr)
.withFieldConst(_.latitude, latitude)
.withFieldConst(_.longitude, longitude)
.withFieldConst(_.property_geography, point.map(p => WKBWriter.toHex(wkbWriter.write(p))))
}
And then when using it I do this:
transformLocationFields(prop, latitude, longitude, incRepresentation, adm3name, adm2name, adm1name, postalCodeValue, addressRepr)
.withFieldConst(_.source_name, "fa")
.withFieldRenamed(_.id, _.iqs_id)
.withFieldComputed(_.property_id, p => SourceRecord.decimalFormat.format(p.property_id))
.withFieldComputed(_.renovation_year, p => Option(p.renovation_year).map(_.toInt))
.withFieldConst(_.gross_floor_area, grossFloorArea)
.transform
after adding Chimney, the autoScalaLibrary := false setting is ignored
Hi, no response to the issue I reported yet. I think I see what might be the root cause:
the same happens if I add scala-collection-compat,
and this dependency is used by Chimney.
However, I see interesting information from @krzemin: "I'm going to drop support for Scala 2.11 in the next Chimney version. [...] Does anyone have any strong argument against it?".
Excellent. From my side, that's one more reason to uphold this decision. Is it actually planned for the next 0.6.2 release?
Regarding autoScalaLibrary - if that happens to be a Chimney issue, I'm happy to merge a PR. As for scala-collection-compat, it's required for easy 2.12/2.13 interop.
I'll keep the ExclusionRule workaround for now, or switch to another way if I find one. Btw. I'm happy to see the discussion on the Scala 3 transition. One point (the 2.11 drop) from that discussion is done already :smirk: (https://gitter.im/scalalandio/chimney?at=5f1ec760b1409c060f887361)
I have two sealed trait hierarchies, A and B. When I want to go from A => B, I can use transformInto. Most of the concrete logic looks the same, but I want to inject logic into one of the concrete type transformations. How would I go about doing that? E.g. https://scastie.scala-lang.org/andyczerwonka/qkeYbqp8T4aWJWZPGOoEoA/6
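A rough sketch of one possible approach, assuming Chimney 0.6.x's withCoproductInstance and made-up Circle/Square subtypes (the real hierarchies are in the Scastie link): let derivation handle the matching subtypes and override only the one that needs custom logic.
import io.scalaland.chimney.dsl._

// made-up hierarchies, just to illustrate the idea
sealed trait A
object A {
  case class Circle(radius: Double) extends A
  case class Square(side: Double) extends A
}

sealed trait B
object B {
  case class Circle(radius: Double) extends B
  case class Square(side: Double, area: Double) extends B
}

val shape: A = A.Square(2.0)

// Circle -> Circle is derived automatically (matched by name);
// only the Square case gets hand-written logic
val converted: B = shape
  .into[B]
  .withCoproductInstance[A.Square](sq => B.Square(sq.side, sq.side * sq.side))
  .transform
// converted == B.Square(2.0, 4.0)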