Glen Marchesani
@fizzy33
It seems like some extends AnyVal type wrapping an Array[Byte], giving you the data of the array without letting it be mutated, would be really nice. I wonder if someone has already done the work on such a thing.
And thanks for sharing your thoughts on this.
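A minimal sketch of such a wrapper (all names here are illustrative, not from any existing library; note that a value class cannot prevent mutation through the original array reference, it only avoids exposing a mutating API):

```scala
// Hedged sketch: a value class giving read-only access to an Array[Byte].
// Caveat: callers holding the original array can still mutate it; the
// wrapper merely withholds any mutating API and copies on export.
final class ReadOnlyBytes(private val underlying: Array[Byte]) extends AnyVal {
  def apply(i: Int): Byte = underlying(i)       // indexed read access
  def length: Int = underlying.length
  def toArray: Array[Byte] = underlying.clone() // defensive copy on export
}
```

Because it extends AnyVal, the wrapper usually avoids an extra allocation, though it still boxes when used generically (e.g. as a type argument or stored in a collection).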
Marcin Szałomski
@baldram

Currently I'm going to check if it will be possible to minimize derivation configuration by adding the support for Enumeratum in some additional sub-project or even in the jsoniter-scala-macros sub-project with an optionally scoped dependency on Enumeratum.
While EnumEntry from Enumeratum is supported [...]

@plokhotnyuk Hi Andriy! Have you had a chance to evaluate creating the Enumeratum-support-related jsoniter-scala-macros sub-project? Is it something in your current work scope, and doable based on the existing work from the past (https://github.com/lloydmeta/enumeratum/pull/176/files)?

Andriy Plokhotnyuk
@plokhotnyuk
@baldram Hi, Marcin! Thanks for your question and support! I think support for Enumeratum's value enums can be added with a test-only scoped dependency on Enumeratum. We just need to add a CodecMakerConfig.useEnumeratumEnumValueId flag that turns on using the value field's value as the id when parsing and serializing objects that extend sealed traits, as was done for Scala enums here: plokhotnyuk/jsoniter-scala@f37dde6 Also, it can bring maximum performance because the Enumeratum API will not be used at all.
Andriy Plokhotnyuk
@plokhotnyuk
I've added tests for Enumeratum enums: plokhotnyuk/jsoniter-scala@73ce8aa
Marcin Szałomski
@baldram
Oh, wow! What an answer :) I'm just seeing the commits. Thank you.
This will be a part of 2.8.2, I guess. Is it already scheduled? I would like to try out the new feature.
Marcin Szałomski
@baldram

@plokhotnyuk I have one more question, Andriy. Is it possible to add a SuppressWarnings annotation to the generated code? Maybe as an option?
The macro causes a lot of Wartremover errors, so we need to suppress them like below.

@SuppressWarnings(Array("org.wartremover.warts.All"))

The problem is that it doesn't make sense to create a single file, let's say Codecs.scala, put all codecs there, and add wartremoverExcluded for the whole file in build.sbt.
In that case, we would have to import the codecs manually, losing the advantage of implicit codec resolution.
If a codec lives inside a companion object, it is picked up automatically, and other features relying on this codec work automatically as well.

So everywhere in the code we add this linter suppression for JsonCodecMaker.make, in each place we use Jsoniter.

object SomeFooBarEvent {
  @SuppressWarnings(Array("org.wartremover.warts.All")) // silencing generated code false-positive issues
  implicit val codec: JsonValueCodec[SomeFooBarEvent] = JsonCodecMaker.make
}

final case class SomeFooBarEvent(
  metadata: FooBarMetadata,
  request: FooBarRequest
)

Maybe such a SuppressWarnings annotation could be generated automatically, or enabled with some option?
Or is there another solution?

Andriy Plokhotnyuk
@plokhotnyuk
@baldram Could you please share the code and commands to reproduce your warnings? They are probably worth fixing immediately in the macro that generates the codec implementations. In the v2.7.1 release I fixed the "match may not be exhaustive" warning, and it is possible that the others are easy to fix too.
Marcin Szałomski
@baldram
The above code is enough. It already causes warnings if there is no @SuppressWarnings. Any use of JsonCodecMaker.make for any hello-world case class causes tons of errors when used with the Wartremover setting wartremoverErrors ++= Warts.all.
Then it is enough to run sbt clean compile from the console, and that's it. Shall I provide some example warnings?
Marcin Szałomski
@baldram

I'm not sure it's easy to fix those warnings, as they cover things like null being prohibited, the lack of triple equals, and many others.
Here is example:

object FooBar {
  implicit val codec: JsonValueCodec[FooBar] = JsonCodecMaker.make
}
final case class FooBar(foo: Int, bar: Option[String])

...causes:

[warn] /mnt/foobar/model/FooBar.scala:11:63: [wartremover:Equals] != is disabled - use =/= or equivalent instead
[warn]   implicit val codec: JsonValueCodec[FooBar] = JsonCodecMaker.make
Now I will disable the errors one by one, e.g. @SuppressWarnings(Array("org.wartremover.warts.Equals")). Then the next one comes, and the next, and the next...
So right after that I get:
[warn] /mnt/foobar/model/FooBar.scala:12:63: [wartremover:Null] null is disabled
Marcin Szałomski
@baldram

So now I do:
@SuppressWarnings(Array("org.wartremover.warts.Equals", "org.wartremover.warts.Null"))
... and I get:

[warn] /mnt/foobar/model/FooBar.scala:12:63: [wartremover:OptionPartial] Option#get is disabled - use Option#fold instead

It looks like a never-ending story, so I end up adding @SuppressWarnings(Array("org.wartremover.warts.All")).
That's why it would be great to generate this SuppressWarnings already, as it is kind of an ugly thing we need to add in the code and excuse ;) in a comment.

If not generating the SuppressWarnings, I was also thinking about implementing some custom annotation alias for Jsoniter, like @SuppressJsoniterWarts, which would suppress all errors under the hood.
Maybe that's also a solution. Adding this short version of SuppressWarnings (@SuppressJsoniterWarts) would be at least a bit nicer.
The best way would be to have it generated already, or to have some configuration in build.sbt (or wherever the library is imported) so that it would generate those @SuppressWarnings(Array("org.wartremover.warts.All")). Not sure this is possible.

Marcin Szałomski
@baldram
Is my description of the problem clear enough?
Andriy Plokhotnyuk
@plokhotnyuk
Thank you, Marcin! It was clear and helpful! Here is a commit with the proposed solution: plokhotnyuk/jsoniter-scala@a550237
Could you please clone the repo with git clone --depth 1000 --branch master git@github.com:plokhotnyuk/jsoniter-scala.git, build and publish a snapshot version locally with sbt clean publishLocal, and then test whether the 2.8.2-SNAPSHOT version works for you?
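To consume the locally published snapshot, the dependency in the testing project's build.sbt would then point at the snapshot version (a sketch, assuming the standard jsoniter-scala module coordinates):

```scala
// build.sbt fragment (illustrative): after `sbt clean publishLocal` in the
// cloned repo, depend on the locally published snapshot. The macros module
// is needed only at compile time.
libraryDependencies ++= Seq(
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core"   % "2.8.2-SNAPSHOT",
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.8.2-SNAPSHOT" % Provided
)
```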
Marcin Szałomski
@baldram
All right. I will do that in the evening, is that OK?
Andriy Plokhotnyuk
@plokhotnyuk
Ok, I've force pushed a better solution that passes internal tests with polluted namespaces: plokhotnyuk/jsoniter-scala@fa8b53f
Marcin Szałomski
@baldram
I've compiled a project with 2.8.2-SNAPSHOT with all SuppressWarnings removed, and there are no linter errors :tada:
I will try it in another project to double-check.
Marcin Szałomski
@baldram
@plokhotnyuk I've tried this in another project. What is important: after I bumped the version, everything works without changes. Next, after removing the linter ignores, I see no warnings.
All tests are passing as well, btw.
I also checked what happens if I introduce an issue next to a codec definition, and it is handled correctly by Wartremover.
Thank you Andriy!
PS: I haven't played yet with Enumeratum, but I think I will try to modify some code in a project for testing purposes tomorrow.
Andriy Plokhotnyuk
@plokhotnyuk
@baldram Thank you a lot for reporting and testing the issue with Wartremover! I've published the fix in the v2.8.2 release.
Marcin Szałomski
@baldram
Thank you Andriy!
Marcin Szałomski
@baldram
@plokhotnyuk I have one more question, Andriy, regarding commits about Scala 3, like "Update Scala 3.x to 3.0.0".
Is there a plan to make jsoniter-scala Scala 3 compatible? I thought it wasn't doable because of macros, and there was no hope for that.
However, seeing commits dealing with Scala 3... That would be so fabulous.
Andriy Plokhotnyuk
@plokhotnyuk
I've tried to publish jsoniter-scala-core with the 3.0.0-RC3 and 3.0.0 versions, but both attempts failed due to some classpath problems with the sonatypeBundleRelease command: xerial/sbt-sonatype#230
Marcin Szałomski
@baldram
Sounds exciting! Does it mean that, despite all those Scala 3 macro-related difficulties, these moves are about trying to use Jsoniter with Scala 3, and that it's close anyway?
Chris Kipp
@ckipp:matrix.org
[m]
I'm not fully sure how to actually phrase this question, but I'm working on migrating the Build Server Protocol off of circe to jsoniter, and we have some case classes that represent requests in the spec that we made codecs for, but some of these requests allow the returned value to be null. However, in that scenario (at least I think that's what's happening) jsoniter will freak out because I believe it wants valid JSON but is instead receiving null. And then you're met with this:
[info]   org.eclipse.lsp4j.jsonrpc.ResponseErrorException: com.github.plokhotnyuk.jsoniter_scala.core.JsonReaderException: expected '{', offset: 0x00000000, buf:
[info] +----------+-------------------------------------------------+------------------+
[info] |          |  0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f | 0123456789abcdef |
[info] +----------+-------------------------------------------------+------------------+
[info] | 00000000 | 6e 75 6c 6c                                     | null             |
[info] +----------+-------------------------------------------------+------------------+

the way we are doing these requests is like

case class Reload()

object Reload {
  implicit val codec: JsonValueCodec[Reload] =
    JsonCodecMaker.make
}

But is there another way to do this that would allow for null?

Andriy Plokhotnyuk
@plokhotnyuk
@ckipp:matrix.org Hi Chris! Great news! A good option would be to use Option[Reload] instead, so null JSON values will be parsed to the None object reference. But if that is not an option for you, then a delegating custom codec can be created to wrap the one derived by the standard make:
  def wrapByNullAllowingCodec[A](codec: JsonValueCodec[A]): JsonValueCodec[A] = new JsonValueCodec[A] {
    override def decodeValue(in: JsonReader, default: A): A =
      if (in.isNextToken('n')) {
        in.readNullOrError[String]("x", "expected value or null")
        nullValue
      } else {
        in.rollbackToken()
        codec.decodeValue(in, default)
      }

    override def encodeValue(x: A, out: JsonWriter): Unit =
      if (x == null) out.writeNull()
      else codec.encodeValue(x, out)

    override val nullValue: A = null.asInstanceOf[A]
  }
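Assuming the wrapByNullAllowingCodec helper above is in scope, the derived codec for Chris's Reload class could then be wrapped like this (a sketch, not compiled here):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

case class Reload()

object Reload {
  // Wrap the macro-derived codec so a bare JSON `null` decodes to
  // nullValue instead of raising "expected '{'".
  implicit val codec: JsonValueCodec[Reload] =
    wrapByNullAllowingCodec(JsonCodecMaker.make)
}
```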
Chris Kipp
@ckipp:matrix.org
[m]
thanks for the response! Ah, good to see your wrapByNullAllowingCodec. I was actually just going down that path, so it's good to see that I was headed in the right direction
Chris Kipp
@ckipp:matrix.org
[m]
cool, this worked perfectly. Thank you!
Andriy Plokhotnyuk
@plokhotnyuk
You are welcome, Chris! I'll definitely switch to BSP, so I'm eagerly awaiting the completion of your migration!
Steven Lai
@steven-lai

I was wondering if I had:

case class Data(id: String, fields: Map[String, String], num: Int)

// How can I read in the following JSON:
{
  "id" : "FOO", 
  "num": 456,
  "fields" : {
    "time" : 123,
    "bar" : "baz"
  }
}

I guess a custom codec? The Stringified annotation or config doesn't seem to apply here

Steven Lai
@steven-lai
I ended up doing:
final case class StringMapValue(underlying: String) extends AnyVal
final case class Data(id: String, fields: Map[String, StringMapValue], num: Int)

implicit val mapValueCodec: JsonValueCodec[StringMapValue] = new JsonValueCodec[StringMapValue]() {
  override def decodeValue(in: JsonReader, default: StringMapValue): StringMapValue = {
    val b = in.nextToken()
    if (b == '"') {
      in.rollbackToken()
      StringMapValue(in.readString(null))
    } else if ((b >= '0' || b <= '9') || b == '-') {
      in.rollbackToken()
      val d = in.readDouble()
      val i = d.toInt
      if (i.toDouble == d) {
        StringMapValue(i.toString)
      } else {
        StringMapValue(d.toString)
      }
    } else if (b == 'n') {
      in.rollbackToken()
      in.readNullOrError(null, "expected `null` value")
      nullValue
    } else if (b == 't' || b == 'f') {
      in.rollbackToken()
      StringMapValue(in.readBoolean().toString)
    } else {
      in.decodeError("expected JSON value")
    }
  }

  override def encodeValue(x: StringMapValue, out: JsonWriter): Unit = {
    out.writeVal(x.underlying)
  }

  override val nullValue: StringMapValue = StringMapValue(null)
}
Andriy Plokhotnyuk
@plokhotnyuk
@steven-lai Hi, Steven! Yes, a custom codec could be a solution here. Could you please explain how the values of the fields field will be used? Do you need to encode the parsed JSON back to the same representation?
Steven Lai
@steven-lai
I don't believe I do, but if I did, how would I achieve that? The fields map is just a generic map for some non-fixed fields
Andriy Plokhotnyuk
@plokhotnyuk
@steven-lai Currently your codec encodes all fields values (except null) as strings and then will not be able to parse them back. For parsed nulls it will just throw NPEs during encoding. Also, it has a couple of bugs in the parsing of numbers and null values. To avoid such problems you can define an ADT (or sum type) for the fields values and use the following codec:
import com.github.plokhotnyuk.jsoniter_scala.macros._
import com.github.plokhotnyuk.jsoniter_scala.core._

object Example01 {
  sealed trait MapValue
  case class StringMapValue(value: String) extends MapValue
  case class BooleanMapValue(value: Boolean) extends MapValue
  case class DoubleMapValue(value: Double) extends MapValue
  case class IntMapValue(value: Int) extends MapValue
  case object NullMapValue extends MapValue

  case class Data(id: String, fields: Map[String, MapValue], num: Int)

  implicit val mapValueCodec: JsonValueCodec[MapValue] = new JsonValueCodec[MapValue] {
    override def decodeValue(in: JsonReader, default: MapValue): MapValue = {
      val b = in.nextToken()
      if (b == '"') {
        in.rollbackToken()
        StringMapValue(in.readString(null))
      } else if (b >= '0' && b <= '9' || b == '-') {
        in.rollbackToken()
        val d = in.readDouble()
        val i = d.toInt
        if (i.toDouble == d) IntMapValue(i)
        else DoubleMapValue(d)
      } else if (b == 't' || b == 'f') {
        in.rollbackToken()
        BooleanMapValue(in.readBoolean())
      } else if (b == 'n') {
        in.readNullOrError(default, "expected `null` value")
        NullMapValue
      } else in.decodeError("expected JSON value")
    }

    override def encodeValue(x: MapValue, out: JsonWriter): Unit = x match {
      case NullMapValue => out.writeNull()
      case StringMapValue(x) => out.writeVal(x)
      case BooleanMapValue(x) => out.writeVal(x)
      case IntMapValue(x) => out.writeVal(x)
      case DoubleMapValue(x) => out.writeVal(x)
    }

    override def nullValue: MapValue = NullMapValue
  }

  implicit val codec: JsonValueCodec[Data] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    val data = readFromArray[Data]("""{
                               |  "id" : "FOO",
                               |  "num": 456,
                               |  "fields" : {
                               |    "time" : 123,
                               |    "bar" : "baz",
                               |    "bool" : true,
                               |    "double" : 123.456890123,
                               |    "null" : null
                               |  }
                               |}""".stripMargin.getBytes("UTF-8"))
    val json = writeToArray(Data(id = "BAR", num = 123,
      fields = Map[String, MapValue]("time" -> IntMapValue(123), "bar" -> StringMapValue("baz"))))

    println(data)
    println(new String(json, "UTF-8"))
  }
}
Andriy Plokhotnyuk
@plokhotnyuk
If your fields values can be arbitrary JSON values and they need to be mutable, then I would recommend using Dijon's SomeJson for them, which uses jsoniter-scala's codecs to parse and encode them.
Another option, if the fields values don't need to be handled and just need to be passed back into the encoded JSON, is to use the RawVal type for the fields field, like here.
Steven Lai
@steven-lai
Thanks!
Dermot Haughey
@hderms
@plokhotnyuk do you have any tips for data structures that jsoniter will work better with? I'm thinking potentially Array will be marginally faster to deserialize than List, for example
Andriy Plokhotnyuk
@plokhotnyuk
@hderms Hi Dermot! Currently the jsoniter-scala-macros API supports Array and all other Scala collections from the standard library, but any collection can be parsed with a custom codec that uses the jsoniter-scala-core API. The throughput of collection parsing depends a lot on how fast an instance of the collection can be created. On the page with benchmark results you can check the results for ArrayBufferOfBooleansReading, ArrayOfBooleansReading, ListOfBooleansReading and VectorOfBooleansReading to get a rough estimate. For more concrete results you can modify and run the corresponding benchmarks in your environment.
Zhenhao Li
@Zhen-hao
hi, has anyone used this library with the Alpakka Elasticsearch connector? the doc, https://doc.akka.io/docs/alpakka/current/elasticsearch.html#with-typed-source, says "The data is converted to and from JSON by Spray JSON." It seems to me to be tightly coupled to Spray JSON
Objektwerks
@objektwerks
@plokhotnyuk Hey, Andriy, I'm thinking of switching from Akka-Http-Upickle to Akka-Http-Jsoniter for 2 reasons: 1) uPickle requires a @nowarn annotation to work with Scala 2.13.6 due to the new and improved exhaustive pattern matching; and 2) IntelliJ can't understand the Akka-Http marshallers and complete directive for uPickle support. I'm hoping, beyond a major speed boost, that Akka-Http-Jsoniter support will work error-free in IntelliJ. I know, an impossible task, perhaps. :) Thoughts? Thanks in advance!
Andriy Plokhotnyuk
@plokhotnyuk
Hello, @objektwerks! Thanks for trying akka-http-jsoniter-scala. I think it is quite possible to switch to it from akka-http-upickle easily. Please let me know if you need help with porting some compile-time/runtime features from uPickle, writing efficient custom codecs, fixing/suppressing some compile-time warnings, etc.
Objektwerks
@objektwerks
Thanks, @plokhotnyuk Andriy. It all seems to work with Akka-Http. Yet I have 3 unique case classes with identical structure, and Jsoniter is parsing them all as the 1st case class, which produces akka-http router -> database integrity errors. I was looking for a way to have Jsoniter provide a case class discriminator key in the JSON. Then I was hoping to examine the Jsoniter-generated JSON, but I couldn't find a convenient method to do so (beyond a series of numbers). So I haven't quite reached the promised land. :)
Objektwerks
@objektwerks
@plokhotnyuk Okay, I figured out the byte-array-to-JSON technique: json.map(_.toChar).mkString, and no case class discriminator is provided. I'm not sure how to enable that feature. I noticed a macro make method that disables it, though. :)
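A hedged sketch of both points, based on jsoniter-scala's documented ADT support (the Event hierarchy and the "kind" field name are illustrative, not from the conversation; the discriminator is emitted only for subtypes of a sealed hierarchy, and with the default config the field is named "type"):

```scala
import java.nio.charset.StandardCharsets
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object DiscriminatorSketch {
  // Structurally identical payloads need a common sealed parent so that
  // the macro knows to emit a discriminator distinguishing the subtypes.
  sealed trait Event
  final case class Created(id: String) extends Event
  final case class Deleted(id: String) extends Event

  // "kind" is an illustrative choice; omit the config to get "type".
  implicit val codec: JsonValueCodec[Event] =
    JsonCodecMaker.make[Event](CodecMakerConfig.withDiscriminatorFieldName(Some("kind")))

  def main(args: Array[String]): Unit = {
    val json = writeToArray[Event](Created("FOO"))
    // Inspect the byte array with an explicit charset rather than
    // json.map(_.toChar).mkString, which breaks for non-ASCII bytes.
    println(new String(json, StandardCharsets.UTF_8))
  }
}
```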