Hi, not sure if I should create a GitHub issue for this, but here's my problem:
Kryo 5.0.0 fails to instantiate on Android 11 with this exception:
java.lang.NoSuchMethodError: No static method of()Ljava/util/Map; in class Ljava/util/Map$-DC; or its super classes
at com.esotericsoftware.kryo.serializers.ImmutableCollectionsSerializers$JdkImmutableMapSerializer.addDefaultSerializers(ImmutableCollectionsSerializers.java:118)
at com.esotericsoftware.kryo.serializers.ImmutableCollectionsSerializers.addDefaultSerializers(ImmutableCollectionsSerializers.java:39)
at com.esotericsoftware.kryo.Kryo.<init>(Kryo.java:231)
...
For some reason, there was no problem adding the default serializers for List.of().
I'm not sure what is going on here: somehow it found the "java.util.ImmutableCollections" class but failed to call java.util.Map.of(). The device is a Pixel 4 with official firmware, so I'd expect this behavior on many other devices too.
Is there any way to force Kryo not to load the default serializers for immutable collections? Maybe some Gradle magic that allows replacing a class, or something similar. I can think of a couple of ways to solve this, but all of them would require changing Kryo's source code.
One possible lead: Map.of(), List.of(), Set.of() / copyOf() can't be resolved in my IDE, which runs with sourceCompatibility set to Java 8. I thought that might be the cause, but I'm still not sure upgrading to Java 9+ makes sense for Android (it would also require a bit of .gradle refactoring, which I'm not a fan of), so I haven't had a chance to check it.
If anyone can explain any of this to me, I'll be very grateful!
Also thanks for making Kryo, it is cool
Kryo only adds those serializers when java.util.ImmutableCollections is on the classpath:

    if (isClassAvailable("java.util.ImmutableCollections")) {
        JdkImmutableListSerializer.addDefaultSerializers(kryo);
        JdkImmutableMapSerializer.addDefaultSerializers(kryo);
        JdkImmutableSetSerializer.addDefaultSerializers(kryo);
    }

Could you debug into ImmutableCollectionsSerializers.addDefaultSerializers() and check why isClassAvailable returns true? If that class can be loaded, the Map.of overloads should be available.
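A self-contained way to reproduce both halves of that check on-device (the helper names here are illustrative, not Kryo's actual internals):

```java
public class AvailabilityCheck {
    // Mirrors the kind of check Kryo does before enabling the JDK immutable
    // collection serializers: does the class load at all?
    static boolean isClassAvailable(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    // The half that seems to fail on Android: does a no-arg public static
    // overload of the named method exist?
    static boolean isMethodAvailable(String className, String methodName) {
        try {
            Class.forName(className).getMethod(methodName);
            return true;
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On a desktop JDK 9+ both are true; the stack trace above suggests
        // that on this Android 11 device the class check passes while the
        // Map.of() lookup fails.
        System.out.println(isClassAvailable("java.util.ImmutableCollections"));
        System.out.println(isMethodAvailable("java.util.Map", "of"));
    }
}
```

Running this in the app (instead of debugging into Kryo) would show which assumption breaks on the device.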
Hello,
I have a deserialization problem with a complex case class (see log).
PS: despite the warning, I haven't noticed any actual malfunction.
I use in my project:
akka 2.6.11 ( akka cluster , akka streams , akka pubsub )
scala 2.12,
play 2.6
Server log:
[warn] 2020-11-30 19:09:31,504 - akka.remote.artery.Deserializer - Failed to deserialize message from [akka://application@127.0.0.1:2551] with serializer id [123454323] and manifest []. com.esotericsoftware.kryo.KryoException: java.lang.NullPointerException
Serialization trace:
underlying (play.api.libs.json.JsObject)
data (models.TaskModel)
task (actors.websockets.ClientActor$TaskDeleted)
[info] 2020-11-30 19:09:31,504 - actors.websockets.ClientActor - task deleted, id: 66657
[warn] 2020-11-30 19:09:31,504 - akka.remote.artery.Deserializer - Failed to deserialize message from [akka://application@127.0.0.1:2551] with serializer id [123454323] and manifest []. com.esotericsoftware.kryo.KryoException: java.lang.NullPointerException
Hello, I am using Kryo 4.0.2 together with "de.javakaffee" % "kryo-serializers" % "0.45" as well as "io.altoo" % "akka-kryo-serialization" % "1.1.0" to do serialization between Akka nodes. Recently we had a case class with a field of type play.api.libs.json.JsValue. JsValue is a trait with various concrete classes for the various parts of a JSON tree (JsString, JsObject, etc.). I noticed that when trying to transfer the case class between nodes, a NullPointerException was raised and the connection between the nodes was disrupted. I only got it to work by explicitly declaring a custom serializer for each concrete implementation of JsValue, as indicated below:
abstract class PlayJsonSerializer[T <: JsValue] extends Serializer[T] {
override def write(kryo: Kryo, output: Output, `object`: T): Unit = output.writeString(Json.stringify(`object`))
override def read(kryo: Kryo, input: Input, `type`: Class[T]): T = Json.parse(input.readString()).asInstanceOf[T]
}
class JsValueSerializer extends PlayJsonSerializer[JsValue]
class JsObjectSerializer extends PlayJsonSerializer[JsObject]
class JsStringSerializer extends PlayJsonSerializer[JsString]
class JsNumberSerializer extends PlayJsonSerializer[JsNumber]
class JsBooleanSerializer extends PlayJsonSerializer[JsBoolean]
class JsArraySerializer extends PlayJsonSerializer[JsArray]
class JsNullSerializer extends PlayJsonSerializer[JsNull.type]
Is this the only way to serialize a Play JSON value with Kryo? Or am I missing something?
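For what it's worth, Kryo resolves default serializers by walking the class hierarchy, so a single registration for the trait may be enough to cover every concrete subtype. A sketch against the plain Kryo API (leaving aside akka-kryo-serialization's configuration layer, which would normally own the Kryo instance):

```java
import com.esotericsoftware.kryo.Kryo;
import play.api.libs.json.JsValue;

public class KryoSetup {
    static Kryo configure() {
        Kryo kryo = new Kryo();
        // addDefaultSerializer applies to JsValue and all of its subclasses,
        // so the per-subtype serializer classes above may be unnecessary: the
        // stringify/parse round trip in PlayJsonSerializer already handles
        // every concrete JsValue.
        kryo.addDefaultSerializer(JsValue.class, JsValueSerializer.class);
        return kryo;
    }
}
```

Whether the NullPointerException itself came from FieldSerializer trying to reflectively construct the JsValue hierarchy is something I can't verify from the log alone.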
I'm writing a sequence of objects to a file with kryo.writeObject. On the read side I want to stream through this file completely. Waiting for the underflow exception doesn't seem good, and .available being zero doesn't actually prove there are no bytes left. I'd expect to be able to call readObjectOrNull and have it return null at end of stream, but I still get the underflow exception. while (input.limit() > input.position() || input.position() == 0) feels a bit odd, especially with the extra start-of-stream check. Am I missing something blindingly obvious? Something like Files.lines that just gives me a Stream<MyObj> might be nice, but Stream support is definitely unnecessary.
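One way to sidestep end-of-stream detection entirely is to count-prefix the data yourself. A dependency-free sketch of the idea using plain java.io (with Kryo you'd use output.writeInt(n, true) on the write side and kryo.readObject(input, MyObj.class) per record on the read side; the class and method names here are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class CountPrefixed {
    // Write side: record the count up front, then the records.
    static byte[] write(List<String> records) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(bytes)) {
            out.writeInt(records.size());
            for (String r : records) out.writeUTF(r);
        }
        return bytes.toByteArray();
    }

    // Read side: the count says exactly when to stop, so there is no
    // underflow to catch and no position/limit juggling. IntStream.range
    // keeps the reads ordered and lazy, giving a Files.lines-style Stream.
    static Stream<String> read(byte[] data) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        int n = in.readInt();
        return IntStream.range(0, n).mapToObj(i -> {
            try {
                return in.readUTF(); // with Kryo: kryo.readObject(input, MyObj.class)
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
    }

    public static void main(String[] args) throws IOException {
        byte[] data = write(Arrays.asList("a", "b", "c"));
        System.out.println(read(data).collect(Collectors.toList())); // [a, b, c]
    }
}
```

The same layout works if you keep appending batches: prefix each batch with its count and loop until the underlying file itself is exhausted.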
I'm trying to upgrade from Java 11 to 17 running Kryo 5 with Unsafe I/O, but am getting a weird NoClassDefFoundError for UnsafeUtil while writing data with writeClassAndObject:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.esotericsoftware.kryo.kryo5.unsafe.UnsafeUtil
at com.esotericsoftware.kryo.kryo5.unsafe.UnsafeOutput.writeByte(UnsafeOutput.java:93)
at com.esotericsoftware.kryo.kryo5.util.DefaultClassResolver.writeName(DefaultClassResolver.java:123)
at com.esotericsoftware.kryo.kryo5.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:114)
at com.esotericsoftware.kryo.kryo5.Kryo.writeClass(Kryo.java:613)
at com.esotericsoftware.kryo.kryo5.Kryo.writeClassAndObject(Kryo.java:708)
I also tested moving away from Unsafe* to plain Input/Output, and it works fine. Now I'm wondering how to get Unsafe to work again.
Did anyone face a similar issue and/or have a solution for it?
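One hedged guess, since I can't verify your exact setup: UnsafeUtil's static initializer does reflective setup against JDK internals, and if that throws under JDK 17's stronger encapsulation, every later use surfaces as "NoClassDefFoundError: Could not initialize class" (the original ExceptionInInitializerError only appears on the very first touch). Opening the packages that unsafe buffer access has historically relied on may bring it back:

```shell
# Assumption: the root cause is blocked reflective access in UnsafeUtil's
# static initializer; these --add-opens flags re-expose the JDK internals.
java --add-opens=java.base/java.nio=ALL-UNNAMED \
     --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
     -jar your-app.jar
```

Wrapping the very first write in a try/catch should reveal the underlying ExceptionInInitializerError and confirm (or refute) which package actually needs opening.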
Hello everyone
I'm running my Apache Spark 3.3.0 pipeline on EMR 6.8.
I'm creating my Spark Session with those two options set:
.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
.set("spark.kryo.registrationRequired", "true")
On my classpath I only have "io.altoo" %% "akka-kryo-serialization" % "2.1.0". I also tried excluding these:
ExclusionRule("org.objenesis", "objenesis"),
ExclusionRule("com.esotericsoftware", "minlog"),
ExclusionRule("com.esotericsoftware", "reflectasm")
but it doesn't help, so I don't think that's the issue.
I keep getting the following error:
Exception in thread "pool-5-thread-47" java.lang.NoSuchMethodError: com.esotericsoftware.kryo.serializers.FieldSerializer.setIgnoreSyntheticFields(Z)V
at com.twitter.chill.KryoBase.newDefaultSerializer(KryoBase.scala:67)
at com.esotericsoftware.kryo.Kryo.getDefaultSerializer(Kryo.java:387)
at com.esotericsoftware.kryo.Kryo.register(Kryo.java:416)
at org.apache.spark.serializer.KryoSerializer.$anonfun$newKryo$2(KryoSerializer.scala:141)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:140)
at org.apache.spark.serializer.KryoSerializer$$anon$1.create(KryoSerializer.scala:102)
at com.esotericsoftware.kryo.pool.KryoPoolQueueImpl.borrow(KryoPoolQueueImpl.java:48)
Could someone please point me to where to look for a solution?
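A hedged diagnosis: a NoSuchMethodError at runtime usually means two binary-incompatible versions of the same class are on the classpath. setIgnoreSyntheticFields exists on Kryo 4's FieldSerializer but the configuration API changed in Kryo 5, and Twitter chill (which Spark's KryoSerializer uses) is compiled against Kryo 4 — so if akka-kryo-serialization 2.1.0 drags a Kryo 5 jar onto the classpath and that jar wins, chill's call fails exactly like this. Something along these lines can confirm it (my-assembly.jar is a placeholder for your deployed artifact):

```shell
# 1) Which kryo jars/classes end up bundled?
unzip -l my-assembly.jar | grep -i kryo

# 2) Does the winning FieldSerializer still have the Kryo 4 method chill calls?
javap -classpath my-assembly.jar \
    com.esotericsoftware.kryo.serializers.FieldSerializer \
  | grep setIgnoreSyntheticFields
```

If the Kryo 5 class is the one being loaded, excluding objenesis/minlog/reflectasm won't help; it's the com.esotericsoftware.kryo classes themselves that need to be kept off Spark's classpath (or shaded).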
Greetings -- I wanted to know how to approach solving this:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 63.0 failed 4 times, most recent failure: Lost task 0.3 in stage 63.0 (TID 305) (172.16.3.239 executor 4): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 13994
at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:137)
at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:804)
at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:307)
at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:196)
I'm streaming into Spark using the RawInputDStream API -- each frame is [int32 size, (serialized object...)] (what rawSocketStream expects). The serialized object in my case is a byte[]. It works fine with the default Java serializer. I tried preregistering Class.forName("[B"), but I think chill does all of this init. Trace logging doesn't show me any objects with ID 13994. Does anyone recognize what this object could be?
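One failure mode that produces exactly this symptom (a guess, since I can't see the receiver code): if the int32 size header, or any stray leading bytes, reach Kryo, it parses them as a class ID and reports an arbitrary "unregistered class ID" like 13994 — i.e. the number may not correspond to any real object at all, just misaligned framing. A self-contained sketch of header-stripped framing (FrameReader and the demo frame are mine, not part of the Spark/chill APIs):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class FrameReader {
    // Split a [int32 length, payload] framed stream into payload byte[]s,
    // matching the wire format rawSocketStream expects. Only the payload
    // should ever reach kryo.readClassAndObject; if the 4-byte header leaks
    // in, its bytes get decoded as a class ID.
    static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();          // big-endian length prefix
        byte[] payload = new byte[len];
        in.readFully(payload);           // hand only this to the deserializer
        return payload;
    }

    public static void main(String[] args) throws IOException {
        // Build a fake frame in memory: length 3, payload {1, 2, 3}.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(3);
        out.write(new byte[] {1, 2, 3});
        byte[] payload = readFrame(
            new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
        System.out.println(payload.length); // 3
    }
}
```

Dumping the first few bytes the deserializer actually receives and comparing them against what the sender wrote would confirm whether the framing lines up.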