
Andriy Plokhotnyuk
@plokhotnyuk
@steven-lai Currently your codec encodes all field values (except null) as strings and then cannot parse them back. For parsed nulls it will just throw NPEs during encoding. It also has a couple of bugs in the parsing of numbers and null values. To avoid such problems you can define an ADT (or sum type) for field values and use the following codec:
import com.github.plokhotnyuk.jsoniter_scala.macros._
import com.github.plokhotnyuk.jsoniter_scala.core._

object Example01 {
  sealed trait MapValue
  case class StringMapValue(value: String) extends MapValue
  case class BooleanMapValue(value: Boolean) extends MapValue
  case class DoubleMapValue(value: Double) extends MapValue
  case class IntMapValue(value: Int) extends MapValue
  case object NullMapValue extends MapValue

  case class Data(id: String, fields: Map[String, MapValue], num: Int)

  implicit val mapValueCodec: JsonValueCodec[MapValue] = new JsonValueCodec[MapValue] {
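    // Peek at the first byte of the next token to detect the JSON type,
    // then roll back so the matching reader method can consume the whole value.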
    override def decodeValue(in: JsonReader, default: MapValue): MapValue = {
      val b = in.nextToken()
      if (b == '"') {
        in.rollbackToken()
        StringMapValue(in.readString(null))
      } else if (b >= '0' && b <= '9' || b == '-') {
        in.rollbackToken()
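        // Read the number as a Double, then narrow it to Int when the value is integral.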
        val d = in.readDouble()
        val i = d.toInt
        if (i.toDouble == d) IntMapValue(i)
        else DoubleMapValue(d)
      } else if (b == 't' || b == 'f') {
        in.rollbackToken()
        BooleanMapValue(in.readBoolean())
      } else if (b == 'n') {
        in.readNullOrError(default, "expected `null` value")
        NullMapValue
      } else in.decodeError("expected JSON value")
    }

    override def encodeValue(x: MapValue, out: JsonWriter): Unit = x match {
      case NullMapValue => out.writeNull()
      case StringMapValue(x) => out.writeVal(x)
      case BooleanMapValue(x) => out.writeVal(x)
      case IntMapValue(x) => out.writeVal(x)
      case DoubleMapValue(x) => out.writeVal(x)
    }

    override def nullValue: MapValue = NullMapValue
  }

  implicit val codec: JsonValueCodec[Data] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    val data = readFromArray[Data]("""{
                               |  "id" : "FOO",
                               |  "num": 456,
                               |  "fields" : {
                               |    "time" : 123,
                               |    "bar" : "baz",
                               |    "bool" : true,
                               |    "double" : 123.456890123,
                               |    "null" : null
                               |  }
                               |}""".stripMargin.getBytes("UTF-8"))
    val json = writeToArray(Data(id = "BAR", num = 123,
      fields = Map[String, MapValue]("time" -> IntMapValue(123), "bar" -> StringMapValue("baz"))))

    println(data)
    println(new String(json, "UTF-8"))
  }
}
Andriy Plokhotnyuk
@plokhotnyuk
If your field values can be arbitrary JSON values and need to be mutable, then I would recommend using Dijon's SomeJson for them, which uses jsoniter-scala codecs for parsing and encoding.
Another option, if the field values don't need to be handled and just need to be passed through to the encoded JSON, is to use the RawVal type for the fields field, like here.
Steven Lai
@steven-lai
Thanks!
Dermot Haughey
@hderms
@plokhotnyuk do you have any tips for data structures that jsoniter will work better with? I'm thinking potentially Array will be marginally faster to deserialize than List, for example
Andriy Plokhotnyuk
@plokhotnyuk
@hderms Hi Dermot! Currently the jsoniter-scala-macros API supports Array and all other Scala collections from the standard library, but any collection can be parsed with a custom codec that uses the jsoniter-scala-core API. Throughput of collection parsing depends a lot on how fast an instance of the collection can be created. On the page with benchmark results you can check the results for ArrayBufferOfBooleansReading, ArrayOfBooleansReading, ListOfBooleansReading and VectorOfBooleansReading to get a rough estimate. For more concrete results you can modify and run the corresponding benchmarks in your environment.
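For illustration, a minimal sketch (the Boolean payload mirrors the benchmark names above; imports are the standard jsoniter-scala ones) that derives codecs for an Array and a List so the two shapes can be compared side by side:

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object CollectionsExample {
  // One derived codec per collection shape under comparison.
  implicit val arrayCodec: JsonValueCodec[Array[Boolean]] = JsonCodecMaker.make
  implicit val listCodec: JsonValueCodec[List[Boolean]] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    val json = "[true,false,true]".getBytes("UTF-8")
    println(readFromArray[Array[Boolean]](json).mkString(",")) // true,false,true
    println(readFromArray[List[Boolean]](json))                // List(true, false, true)
  }
}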
Zhenhao Li
@Zhen-hao
hi, has anyone used this library with the Alpakka Elasticsearch connector? The doc, https://doc.akka.io/docs/alpakka/current/elasticsearch.html#with-typed-source, says "The data is converted to and from JSON by Spray JSON." It seems to me to be tightly coupled with Spray JSON.
Objektwerks
@objektwerks
@plokhotnyuk Hey, Andriy, I'm thinking of switching from Akka-Http-Upickle to Akka-Http-Jsoniter for 2 reasons: 1) uPickle requires a @nowarn annotation to work with Scala 2.13.6 due to the new and improved exhaustive pattern matching; and 2) IntelliJ can't understand the Akka-Http marshallers and complete directive for uPickle support. I'm hoping, beyond a major speed boost, that Akka-Http-Jsoniter support will work error-free in IntelliJ. I know, an impossible task, perhaps. :) Thoughts? Thanks in advance!
Andriy Plokhotnyuk
@plokhotnyuk
Hello, @objektwerks! Thanks for trying akka-http-jsoniter-scala. I think it should be quite easy to switch to it from akka-http-upickle. Please let me know if you need help porting some compile-time/runtime feature from uPickle, writing efficient custom codecs, fixing/suppressing compile-time warnings, etc.
Objektwerks
@objektwerks
Thanks, @plokhotnyuk Andriy. It all seems to work with Akka-Http. Yet I have 3 unique case classes with identical structure, and Jsoniter is parsing them all as the 1st case class, which produces akka-http router -> database integrity errors. I was looking for a way to have Jsoniter provide a case class discriminator key in the JSON. Then I was hoping to examine the Jsoniter-generated JSON, but couldn't find a convenient method to do so (beyond a series of numbers). So I haven't quite reached the promised land. :)
Objektwerks
@objektwerks
@plokhotnyuk Okay, I figured out the byte-array-to-JSON technique: json.map(_.toChar).mkString. No case class discriminator is provided, though, and I'm not sure how to enable that feature. I noticed a macro make method that disables it. :)
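A small aside on that technique: mapping each byte to a Char only round-trips for ASCII output, so decoding as UTF-8 (or using the library's writeToString) is the safer route. A sketch, with an illustrative Pump shape borrowed from later in this thread:

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object ToJsonString {
  case class Pump(id: Int, poolId: Int, installed: Int, model: String)

  implicit val codec: JsonValueCodec[Pump] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    val pump = Pump(0, 1, 1, "model.pump")
    println(new String(writeToArray(pump), "UTF-8")) // decodes multi-byte characters correctly
    println(writeToString(pump))                     // skips the intermediate byte array altogether
  }
}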
Andriy Plokhotnyuk
@plokhotnyuk
Could you please provide some isolated code that reproduces your problem? You can share it as a Scastie code snippet or as a pull request for an example project.
For example, it could be the parsing of a string constant with an assert comparing the expected and actual parsed case classes.
Objektwerks
@objektwerks
@plokhotnyuk The code is rather complicated. Imagine 3 case classes A, B and C, each with this structure: id: Int, parentId: Int, installed: Int, model: String. My Akka-Http router tests naturally test A, B and C via their respective routers, and all affected database tables are good. This works with uPickle, for instance.
Yet it seems as though Jsoniter is interpreting all 3 case classes as case class A. And that naturally breaks the router-to-database code.
So is there some case class discriminator key feature in Jsoniter that might resolve this confusion?
Andriy Plokhotnyuk
@plokhotnyuk
Do you have separate codecs derived by the make call for each of the A, B and C types?
Objektwerks
@objektwerks
Yes, I do. See: https://github.com/objektwerks/pwa.pool/blob/master/shared/src/main/scala/pool/Codecs.scala
It all looks good to go. In the code above, the affected classes are Pump, Timer and Heater.
See: https://github.com/objektwerks/pwa.pool/blob/master/shared/src/main/scala/pool/Entity.scala
You'll note all 3 have identical field names, just different case class names. Jsoniter thinks all 3 are a Pump.
And in my router test, I get 3 record entries in the pump table, nothing in the timer or heater tables.
Objektwerks
@objektwerks
@plokhotnyuk Andriy, I've gotta step out for a few hours. This is by no means a high-priority issue for me. Any suggested resolutions would be awesome, at your convenience. Cheers!
Objektwerks
@objektwerks
@plokhotnyuk Andriy, I built this unit test: https://github.com/objektwerks/pwa.pool/blob/master/shared/src/test/scala/pool/CodecsTest.scala
Note the imports for each test. Jsoniter gets confused about codecs when they're all defined in a single object.
It looks as though I may have to place a custom codec in the companion object of each entity. Does that sound about right?
Andriy Plokhotnyuk
@plokhotnyuk
Not at the computer now. Probably the problem is that you have used the implicit keyword for the codec vals of the Entity trait implementations.
Andriy Plokhotnyuk
@plokhotnyuk
You can define them in a different namespace so they are not injected when deriving the codec for Entity.
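A minimal sketch of that arrangement (type and field names are borrowed from later in this thread; this is an illustration, not the actual PR code): the sum-type codec stays implicit so it is picked up for Entity, while the product-type codecs live in their own namespace where Entity derivation won't see them.

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object EntityCodecs {
  sealed trait Entity
  case class Pump(id: Int, poolId: Int, installed: Int, model: String) extends Entity
  case class Heater(id: Int, poolId: Int, installed: Int, model: String) extends Entity

  // Sum-type codec: emits a "type" discriminator, so structurally identical
  // leaves such as Pump and Heater remain distinguishable in JSON.
  implicit val entityCodec: JsonValueCodec[Entity] = JsonCodecMaker.make

  // Product-type codecs in a separate namespace: not implicit at the Entity
  // derivation site, so they cannot shadow the discriminated encoding above.
  object LeafCodecs {
    val pumpCodec: JsonValueCodec[Pump] = JsonCodecMaker.make
    val heaterCodec: JsonValueCodec[Heater] = JsonCodecMaker.make
  }
}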
Andriy Plokhotnyuk
@plokhotnyuk
I isolated the codecs for product types from the codecs for sum types and modified your tests to show the difference between these codecs; please check whether it passes your end-to-end and integration tests: objektwerks/pwa.pool#1
Objektwerks
@objektwerks
@plokhotnyuk Thanks, Andriy. I re-added jsoniter support and just got the codecs test to work correctly! The type key appears in all 3 entity tests, but not in the individual pump, heater or timer tests. And the router test still fails with the same problem: jsoniter still thinks a heater and a timer are a pump.
Objektwerks
@objektwerks
jsoniter pump: Pump(0,1,1,model.pump)
jsoniter pump as json: {"poolId":1,"installed":1,"model":"model.pump"}
jsoniter pump as entity json: {"type":"Pump","poolId":1,"installed":1,"model":"model.pump"}
jsoniter heater: Heater(0,1,1,model.heater)
jsoniter heater as json: {"poolId":1,"installed":1,"model":"model.heater"}
jsoniter heater as entity json: {"type":"Heater","poolId":1,"installed":1,"model":"model.heater"}
jsoniter timer: Timer(0,1,1,model.timer)
jsoniter timer as json: {"poolId":1,"installed":1,"model":"model.timer"}
jsoniter timer as entity json: {"type":"Timer","poolId":1,"installed":1,"model":"model.timer"}
Is there a way to force the generation of a type key for the pump, heater and timer JSON?
Objektwerks
@objektwerks
upickle pool as json: {"$type":"pool.Pool","license":"abc123","name":"pool","built":1991,"lat":26.85,"lon":82.29,"volume":10000}
This is how uPickle, for instance, does it for each of my entities.
Andriy Plokhotnyuk
@plokhotnyuk
I've updated the PR to reuse the sum-type codecs for parsing and serializing the ADT leaf types, and modified the unit test to check that the serialized JSON is identical:
jsoniter pump: Pump(0,1,1,model.pump)
jsoniter pump as json: {"type":"Pump","poolId":1,"installed":1,"model":"model.pump"}
jsoniter pump as entity json: {"type":"Pump","poolId":1,"installed":1,"model":"model.pump"}
jsoniter heater: Heater(0,1,1,model.heater)
jsoniter heater as json: {"type":"Heater","poolId":1,"installed":1,"model":"model.heater"}
jsoniter heater as entity json: {"type":"Heater","poolId":1,"installed":1,"model":"model.heater"}
jsoniter timer: Timer(0,1,1,model.timer)
jsoniter timer as json: {"type":"Timer","poolId":1,"installed":1,"model":"model.timer"}
jsoniter timer as entity json: {"type":"Timer","poolId":1,"installed":1,"model":"model.timer"}
Objektwerks
@objektwerks
@plokhotnyuk Thanks for your help, Andriy. I made the codec/test changes, and everything works correctly. In the router test, though, I now get the following error: ClassCastException: pool.Surface cannot be cast to pool.Pool ... also for Pump, Timer, Heater, Measurement, Cleaning, Chemical, Supply and Repair. All extend Entity, including Pool. Pool has a 1-to-N relationship with these entities. But the casting exceptions make no sense. So perhaps we need to tweak the fromSumTypeCodec method. Not sure. All changes are pushed to master.
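For context, a hypothetical sketch of what a fromSumTypeCodec helper along these lines might look like (the actual PR code isn't shown in this thread, so the body below is an assumption, reusing the Entity trait from the earlier sketch). It also suggests where the ClassCastException can come from: the generic cast is erased, so a payload whose type key names a different leaf only blows up later, at the concrete use site.

import com.github.plokhotnyuk.jsoniter_scala.core._

object SumTypeReuse {
  import EntityCodecs._

  // Hypothetical helper: delegate a leaf type's codec to the sum-type codec
  // so the leaf JSON carries the same "type" discriminator.
  def fromSumTypeCodec[A <: Entity](sumCodec: JsonValueCodec[Entity]): JsonValueCodec[A] =
    new JsonValueCodec[A] {
      override def decodeValue(in: JsonReader, default: A): A =
        // Erased cast: a mismatched type key surfaces later as a
        // ClassCastException at the use site, e.g. "Surface cannot be cast to Pool".
        sumCodec.decodeValue(in, default).asInstanceOf[A]
      override def encodeValue(x: A, out: JsonWriter): Unit = sumCodec.encodeValue(x, out)
      override def nullValue: A = null.asInstanceOf[A]
    }
}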
Andriy Plokhotnyuk
@plokhotnyuk
@objektwerks Could you please provide steps for setting up and running the router test? Probably the codec imports should be defined closer to the places where they are used, or the codec params should be passed explicitly.
Objektwerks
@objektwerks
@plokhotnyuk Andriy, if you feel the fromSumTypeCodec method is solid, I'll play around with the router/test imports and the like. My apologies for taking up so much of your time. FWIW, this all worked fine with Circe and uPickle; Jsoniter just needs a bit more work. Thanks again for your help!
Andriy Plokhotnyuk
@plokhotnyuk
@objektwerks I may be wrong, but I think that using separate routes for each Entity subclass is overkill. It looks like routing and storage could be greatly simplified by using one entities route that works polymorphically via the type key values from the payload.
Objektwerks
@objektwerks
@plokhotnyuk Yes, no doubt, there are multiple ways to build Akka-Http routes. Paths vs. pattern matching, for instance. In Scala, there's no wrong way, just many ways. :)
Nils Kilden-Pedersen
@nilskp
Once again I have a parsing failure, but I'm not sure where it's from, since stack traces are disabled by default.
First, how do I enable those again? And second, is there any easy way to enable this wholesale across the entire runtime, or to change the default?
It looks like a very cumbersome default to override, as it's specified all the way down at the individual read methods. And it's not implicit either, so I have to reference a ReaderConfig everywhere, manually. I guess that's less than optimal.
Andriy Plokhotnyuk
@plokhotnyuk
@nilskp As a workaround you can define a delegating call with an implicit parameter for the config that has your custom default settings: def read[A](buf: Array[Byte])(implicit codec: JsonValueCodec[A], config: ReaderConfig = CustomReaderConfig) = readFromArray(buf, config)(codec)
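Spelled out as a compilable sketch (CustomReaderConfig is the assumed name from the message above; withThrowReaderExceptionWithStackTrace is the ReaderConfig setting that turns stack traces back on):

import com.github.plokhotnyuk.jsoniter_scala.core._

object Reads {
  // One place to re-enable stack traces for every read routed through here.
  val CustomReaderConfig: ReaderConfig =
    ReaderConfig.withThrowReaderExceptionWithStackTrace(true)

  def read[A](buf: Array[Byte])(implicit codec: JsonValueCodec[A],
                                config: ReaderConfig = CustomReaderConfig): A =
    readFromArray(buf, config)(codec)
}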
Nils Kilden-Pedersen
@nilskp
@plokhotnyuk That should work. Thanks.
b-gyula
@b-gyula
Is there an (implicit) generic transitive codec for collections, like in play-json? Or, if I want to generate JSON for an Array[User], do I need to create a codec (implicit val codec: JsonValueCodec[Array[User]] = make) regardless of whether I already have a codec for the User class?
Andriy Plokhotnyuk
@plokhotnyuk
Hi, @b-gyula! Usually you need to call make only for the top-level type of your data structure, as described in the README example. All nested codecs will be automatically generated and inlined into the top-level one. Defining type class instances for other message types by calling make(cfg: CodecMakerConfig) might be needed only if the derivation configuration should differ from the default or custom configuration used for the top-level type.
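A minimal sketch of that answer (the User shape is assumed from the question): only the top-level Array[User] codec is defined, and the nested User codec is generated and inlined into it.

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object Users {
  case class User(name: String) // assumed shape, for illustration

  // One top-level codec; no separate JsonValueCodec[User] is needed.
  implicit val usersCodec: JsonValueCodec[Array[User]] = JsonCodecMaker.make

  def main(args: Array[String]): Unit =
    println(new String(writeToArray(Array(User("a"), User("b"))), "UTF-8")) // [{"name":"a"},{"name":"b"}]
}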
Nils Kilden-Pedersen
@nilskp
Could be because arrays are reified?
quanganhtran
@quanganhtran
I can serialize a case object extending a sealed trait as a string using withDiscriminatorFieldName(None), but that string is in PascalCase. How can I make it kebab-case?
I tried withFieldNameMapper but I'm still getting PascalCase.
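For reference, a hedged guess at the relevant knob (assuming case-object names are mapped by the ADT leaf-class name mapper rather than the field name mapper; the composition below is illustrative and not confirmed in this thread):

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object KebabCase {
  sealed trait Color
  case object DarkRed extends Color // illustrative ADT

  // Assumption: adtLeafClassNameMapper (not fieldNameMapper) controls how leaf
  // class/object names are rendered when the discriminator field is disabled.
  implicit val codec: JsonValueCodec[Color] = JsonCodecMaker.make(
    CodecMakerConfig
      .withDiscriminatorFieldName(None)
      .withAdtLeafClassNameMapper { fullName =>
        val simple = fullName.substring(fullName.lastIndexOf('.') + 1)
        simple.replaceAll("([a-z0-9])([A-Z])", "$1-$2").toLowerCase // "DarkRed" -> "dark-red"
      })

  def main(args: Array[String]): Unit =
    println(writeToString(DarkRed: Color)) // expected: "dark-red"
}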