`value.serializer` in your config. Use the manually created instances instead.
`GenericRecord` should work, since it just makes sure that only
`GenericRecord` values pass through it to the
`GenericAvroSerializer`. You can consume it as in https://github.com/ferhtaydn/sack/tree/master/src/main/scala/com/ferhtaydn/sack/avro
the magic byte `0x0` as the first byte
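As a quick illustration, Confluent's Avro wire format puts the magic byte `0x0` first, followed by a 4-byte schema id and then the Avro payload. A minimal sketch of checking for that byte (the helper name is made up for illustration):

```scala
// Confluent wire format: byte 0 is the magic byte 0x0, bytes 1-4 the
// schema registry id, then the Avro-encoded payload.
// `hasMagicByte` is a hypothetical helper, not part of any library.
def hasMagicByte(bytes: Array[Byte]): Boolean =
  bytes.nonEmpty && bytes(0) == 0x0.toByte
```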
hey @jkpl, the problem is solved by using the raw
`KafkaAvroSerializer` and giving the
`Object` type to both the producer and the records. Strictly generically typed consumers and producers are a bit difficult to use at this level of the Kafka ecosystem stack, I think; when you pass everything as
`Object` it just works :) You may want to check the last comment on the issue: datamountaineer/stream-reactor#80
Thanks for the help.
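A minimal sketch of that workaround, building the producer config by hand (the broker and registry URLs are placeholders; the class names are Confluent's):

```scala
import java.util.Properties

// Hand everything to the raw KafkaAvroSerializer as Object so no
// generic-type mismatch can occur between producer and records.
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")           // placeholder
props.put("schema.registry.url", "http://localhost:8081")  // placeholder
props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")

// The producer would then be typed with Object on both sides:
//   val producer = new KafkaProducer[Object, Object](props)
```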
You just need to set the `enable.idempotence` option on the producer. See the Kafka docs here: http://kafka.apache.org/0110/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html
The config helper classes in scala-kafka-client now have a matching parameter (`KafkaProducer.Conf` -> `enableIdempotence: Boolean = false`)
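A sketch of enabling it via the raw config key, per the Kafka docs linked above (broker URL is a placeholder; with scala-kafka-client's helper the same thing would be the `enableIdempotence = true` parameter on `KafkaProducer.Conf`):

```scala
import java.util.Properties

// Turn on the idempotent producer via the standard Kafka config key.
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")  // placeholder
props.put("enable.idempotence", "true")
```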