Could you git clone --depth 1000 --branch master git@github.com:plokhotnyuk/jsoniter-scala.git, build and publish a snapshot version locally with sbt clean publishLocal, and then test whether the 2.8.2-SNAPSHOT version works for you?
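The locally published snapshot can then be referenced from the consuming build in sbt, roughly like this (a sketch: the coordinates are the usual jsoniter-scala artifacts and the version is the snapshot built above):

```scala
// build.sbt — depend on the locally published snapshot (sketch)
libraryDependencies ++= Seq(
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core"   % "2.8.2-SNAPSHOT",
  // the macros module is only needed at compile time
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.8.2-SNAPSHOT" % Provided
)
```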
Is jsoniter-scala Scala 3 compatible? I thought it wasn't doable because of the macros, and that there was no hope for it.
I tried to release jsoniter-scala-core with the 3.0.0-RC3 and 3.0.0 versions, but both attempts failed due to some classpath problems with the sonatypeBundleRelease command: xerial/sbt-sonatype#230
null. However, in that scenario (at least I think that's what's happening) jsoniter will freak out, because I believe it wants valid JSON but is instead receiving null. And then you're met with this:
[info] org.eclipse.lsp4j.jsonrpc.ResponseErrorException: com.github.plokhotnyuk.jsoniter_scala.core.JsonReaderException: expected '{', offset: 0x00000000, buf:
[info] +----------+-------------------------------------------------+------------------+
[info] | | 0 1 2 3 4 5 6 7 8 9 a b c d e f | 0123456789abcdef |
[info] +----------+-------------------------------------------------+------------------+
[info] | 00000000 | 6e 75 6c 6c | null |
[info] +----------+-------------------------------------------------+------------------+
The way we are doing these requests is like this:
case class Reload()

object Reload {
  implicit val codec: JsonValueCodec[Reload] =
    JsonCodecMaker.make
}
But is there another way to do this that would allow for null?
You can use Option[Reload] instead, so null JSON values will be parsed to the None object reference. But if that is not an option for you, then a delegating custom codec can be created to wrap the one derived by the standard make:

def wrapByNullAllowingCodec[A](codec: JsonValueCodec[A]): JsonValueCodec[A] = new JsonValueCodec[A] {
  override def decodeValue(in: JsonReader, default: A): A =
    if (in.isNextToken('n')) {
      in.readNullOrError[String]("x", "expected value or null")
      nullValue
    } else {
      in.rollbackToken()
      codec.decodeValue(in, default)
    }

  override def encodeValue(x: A, out: JsonWriter): Unit =
    if (x == null) out.writeNull()
    else codec.encodeValue(x, out)

  override val nullValue: A = null.asInstanceOf[A]
}
Then wrap the derived codec with wrapByNullAllowingCodec.
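Putting it together for the Reload example, the wrapping might look like this (a minimal, self-contained sketch; the object name NullTolerantReload is just for illustration):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object NullTolerantReload {
  // Delegating codec that turns a top-level JSON `null` into `nullValue`
  // instead of failing with "expected '{'" (same helper as above)
  def wrapByNullAllowingCodec[A](codec: JsonValueCodec[A]): JsonValueCodec[A] =
    new JsonValueCodec[A] {
      override def decodeValue(in: JsonReader, default: A): A =
        if (in.isNextToken('n')) {
          in.readNullOrError[String]("x", "expected value or null")
          nullValue
        } else {
          in.rollbackToken()
          codec.decodeValue(in, default)
        }

      override def encodeValue(x: A, out: JsonWriter): Unit =
        if (x == null) out.writeNull()
        else codec.encodeValue(x, out)

      override val nullValue: A = null.asInstanceOf[A]
    }

  case class Reload()

  implicit val codec: JsonValueCodec[Reload] =
    wrapByNullAllowingCodec(JsonCodecMaker.make)

  def main(args: Array[String]): Unit = {
    println(readFromString[Reload]("{}"))   // Reload()
    println(readFromString[Reload]("null")) // null
  }
}
```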
I was actually just going down that path, so it's good to see that I was headed in the right direction.
I was wondering, if I had:

case class Data(id: String, fields: Map[String, String], num: Int)

// How can I read in the following JSON:
{
  "id" : "FOO",
  "num": 456,
  "fields" : {
    "time" : 123,
    "bar" : "baz"
  }
}
I guess a custom codec? The Stringified annotation or config doesn't seem to apply here
final case class StringMapValue(underlying: String) extends AnyVal

final case class Data(id: String, fields: Map[String, StringMapValue], num: Int)

implicit val mapValueCodec: JsonValueCodec[StringMapValue] = new JsonValueCodec[StringMapValue] {
  override def decodeValue(in: JsonReader, default: StringMapValue): StringMapValue = {
    val b = in.nextToken()
    if (b == '"') {
      in.rollbackToken()
      StringMapValue(in.readString(null))
    } else if ((b >= '0' || b <= '9') || b == '-') {
      in.rollbackToken()
      val d = in.readDouble()
      val i = d.toInt
      if (i.toDouble == d) StringMapValue(i.toString)
      else StringMapValue(d.toString)
    } else if (b == 'n') {
      in.rollbackToken()
      in.readNullOrError(null, "expected `null` value")
      nullValue
    } else if (b == 't' || b == 'f') {
      in.rollbackToken()
      StringMapValue(in.readBoolean().toString)
    } else {
      in.decodeError("expected JSON value")
    }
  }

  override def encodeValue(x: StringMapValue, out: JsonWriter): Unit =
    out.writeVal(x.underlying)

  override val nullValue: StringMapValue = StringMapValue(null)
}
This codec will serialize all fields values (except null) as strings and then will not be able to parse them back. For parsed nulls it will just throw NPEs during encoding. Also, it has a couple of bugs in the parsing of numbers and null values. To avoid such problems you can define an ADT (or sum type) for the fields values and use the following codec:

import com.github.plokhotnyuk.jsoniter_scala.macros._
import com.github.plokhotnyuk.jsoniter_scala.core._
object Example01 {
  sealed trait MapValue

  case class StringMapValue(value: String) extends MapValue

  case class BooleanMapValue(value: Boolean) extends MapValue

  case class DoubleMapValue(value: Double) extends MapValue

  case class IntMapValue(value: Int) extends MapValue

  case object NullMapValue extends MapValue

  case class Data(id: String, fields: Map[String, MapValue], num: Int)

  implicit val mapValueCodec: JsonValueCodec[MapValue] = new JsonValueCodec[MapValue] {
    override def decodeValue(in: JsonReader, default: MapValue): MapValue = {
      val b = in.nextToken()
      if (b == '"') {
        in.rollbackToken()
        StringMapValue(in.readString(null))
      } else if (b >= '0' && b <= '9' || b == '-') {
        in.rollbackToken()
        val d = in.readDouble()
        val i = d.toInt
        if (i.toDouble == d) IntMapValue(i)
        else DoubleMapValue(d)
      } else if (b == 't' || b == 'f') {
        in.rollbackToken()
        BooleanMapValue(in.readBoolean())
      } else if (b == 'n') {
        in.readNullOrError(default, "expected `null` value")
        NullMapValue
      } else in.decodeError("expected JSON value")
    }

    override def encodeValue(x: MapValue, out: JsonWriter): Unit = x match {
      case NullMapValue => out.writeNull()
      case StringMapValue(x) => out.writeVal(x)
      case BooleanMapValue(x) => out.writeVal(x)
      case IntMapValue(x) => out.writeVal(x)
      case DoubleMapValue(x) => out.writeVal(x)
    }

    override def nullValue: MapValue = NullMapValue
  }

  implicit val codec: JsonValueCodec[Data] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    val data = readFromArray[Data]("""{
      | "id" : "FOO",
      | "num": 456,
      | "fields" : {
      |   "time" : 123,
      |   "bar" : "baz",
      |   "bool" : true,
      |   "double" : 123.456890123,
      |   "null" : null
      | }
      |}""".stripMargin.getBytes("UTF-8"))
    val json = writeToArray(Data(id = "BAR", num = 123,
      fields = Map[String, MapValue]("time" -> IntMapValue(123), "bar" -> StringMapValue("baz"))))
    println(data)
    println(new String(json, "UTF-8"))
  }
}
If the fields values can be arbitrary JSON values and they could be mutable, then I would recommend using Dijon's SomeJson for them; it uses jsoniter-scala's codec to parse and encode.
If the fields values do not need to be handled and just need to be passed back into the encoded JSON, then use the RawVal type for the fields field, like here.
Array and all other Scala collections from the standard library are supported out of the box, and any other collection can be parsed with a custom codec that uses the jsoniter-scala-core API. Throughput of collection parsing depends a lot on how fast an instance of the collection can be created. On the page with benchmark results you can check the results for ArrayBufferOfBooleansReading, ArrayOfBooleansReading, ListOfBooleansReading and VectorOfBooleansReading to get a rough estimate. For more concrete results you can modify and run the corresponding benchmarks in your environment.
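For the standard-library collections, deriving a codec is a one-liner per collection type; a minimal sketch:

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

object CollectionExample {
  // Codecs for standard-library collections are derived by JsonCodecMaker.make
  implicit val listCodec: JsonValueCodec[List[Boolean]] = JsonCodecMaker.make
  implicit val vectorCodec: JsonValueCodec[Vector[Int]] = JsonCodecMaker.make

  def main(args: Array[String]): Unit = {
    println(readFromString[List[Boolean]]("[true,false]")) // List(true, false)
    println(readFromString[Vector[Int]]("[1,2,3]"))        // Vector(1, 2, 3)
  }
}
```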
id: Int, parentId: Int, installed: Int, model: String
My Akka-Http router tests naturally test A, B and C via their respective routers, and all affected database tables are good. This works with uPickle, for instance.
https://github.com/objektwerks/pwa.pool/blob/master/shared/src/main/scala/pool/Entity.scala