Hi @zhoekstra_gitlab I see the tasks you're using are for generating classes in Standard format, but the task you're customizing is for SpecificRecord. Maybe you're aiming for avrohugger.format.Standard.defaultTypes.copy(enum = ScalaCaseObjectEnum).
Thanks for asking, I've updated the README to try to make that a little clearer.
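For reference, a minimal build.sbt sketch of that customization; the avroScalaCustomTypes key here is an assumption from my reading of the README, so check it against your plugin version:

// build.sbt: customize the Standard format's enum representation
// (setting name assumed from the README; the SpecificRecord format has its own key)
avroScalaCustomTypes := avrohugger.format.Standard.defaultTypes.copy(
  enum = avrohugger.types.ScalaCaseObjectEnum
)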
Seems like it works the way I want! Thanks a lot, Julian!
In the meantime, I've sketched a fix for one of the bugs; would love to hear your opinion on it.
The doc mentions "Standard: Vanilla case classes (for use with Apache Avro's GenericRecord API, etc.)", but I'm a bit puzzled how to shovel those into the GenericRecord API. Can you come up with an example? In particular, how can I map a generated case class to a GenericRecord?
@julianpeeters Do you know of an avro client runtime implementation that builds with Scala.js? How big is the gap between what avrohugger does and such a client runtime?
Should avrohugger be able to handle something like https://github.com/SemanticBeeng/RPC-Avro-Seed/blob/6a2f99ba78356e962d40967d532a79d991db37ab/server/modules/protocol/src/main/resources/PeopleService.avdl#L2-L7 ?
Just looking for insights from building avrohugger.
Would you say it would be easy to build such a runtime with hammock? http://pepegar.com/hammock/marshalling.html
Thanks
Is sbt-avrohugger published on Maven Central? It looks like it isn't: https://mvnrepository.com/artifact/com.julianpeeters/sbt-avrohugger returns a 404. Or have you moved it to some other repository?
@SemanticBeeng no, I don't know of an avro runtime for Scala.js. There's the javascript project called AVSC; you might be able to make a facade for that. Writing a new runtime library would probably look a lot like circe and/or avro4s, which I imagine would then be integrated into hammock like the circe example you linked.
I'd expect avrohugger will be able to generate from and to the directories you want. RPC support is limited to the SpecificRecord format, however; otherwise the avro message definitions are ignored. I'd imagine you'd want to use Standard vanilla case classes with Scala.js. I believe 47 Degrees uses some post-processing of avrohugger's output in order to handle generating definitions for RPC/messages.
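A facade could start small; here's a rough, untested sketch against the Type.forSchema/toBuffer/fromBuffer API that AVSC documents (the Scala-side names are made up):

import scala.scalajs.js
import scala.scalajs.js.annotation.JSImport

// minimal hypothetical Scala.js facade over the avsc npm package
@js.native
@JSImport("avsc", JSImport.Namespace)
object Avsc extends js.Object {
  val Type: AvscTypeCompanion = js.native
}

@js.native
trait AvscTypeCompanion extends js.Object {
  def forSchema(schema: js.Any): AvscType = js.native
}

@js.native
trait AvscType extends js.Object {
  def toBuffer(value: js.Any): js.Any = js.native   // returns a Node Buffer
  def fromBuffer(buffer: js.Any): js.Any = js.native
}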
@julianpeeters et al, I see some previous discussions about supporting joda DateTime. We have hacked avrohugger.matchers.TypeMatcher in our fork to support DateTime in the following fashion:
case Schema.Type.STRING =>
  // bit of a HACK to get joda DateTime to generate.
  // Note that "datetime" does not make it into getLogicalType, since avro
  // (java project, 1.8.2) filters out logical types that are not in the spec,
  // so we read the raw "logicalType" object property instead.
  val logicalType =
    scala.util.Try(schema.getObjectProp("logicalType").toString).getOrElse("")
  logicalType match {
    case "datetime"  => RootClass.newClass(nme.createNameType("org.joda.time.DateTime"))
    case "localdate" => RootClass.newClass(nme.createNameType("org.joda.time.LocalDate"))
    case _           => StringClass // fall through to the default string mapping
  }
(Credit goes to a former colleague who's not in this chat.)
We wanted to start a conversation on what people think about something like this making it into the main repo. I know the above looks very hacky, so maybe TypeMatcher could be made extendable to support custom types? That way we don't force it on everyone. (A possible shape is sketched below.)
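For concreteness, one hypothetical shape such an extension point could take; none of these names exist in avrohugger today:

import org.apache.avro.Schema

// hypothetical hook: map the raw "logicalType" property of a string schema
// to a fully qualified class name, before falling back to the default mapping
final case class CustomStringTypes(overrides: Map[String, String]) {
  def resolve(schema: Schema): Option[String] = {
    val logicalType =
      scala.util.Try(schema.getObjectProp("logicalType").toString).getOrElse("")
    overrides.get(logicalType)
  }
}

// usage: TypeMatcher would consult this before its built-in string mapping
val jodaOverrides = CustomStringTypes(Map(
  "datetime"  -> "org.joda.time.DateTime",
  "localdate" -> "org.joda.time.LocalDate"))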
@deklanw I'm not experienced with Flink, and I don't recognize how that Order could be coming from avrohugger. However, the fact that we're seeing a reflection-related error might suggest that the avrohugger-generated Scala classes are indeed the cause. Scala can't be reflected by Java very well, and avro (and frameworks' avro integrations) loves to try: https://github.com/apache/flink/blob/master/flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroInputFormat.java#L116
I imagine you're trying with the Specific api? I'd recommend sending a PR to Flink to use the avro methods that take a schema as an argument instead of getting it by reflection.
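For illustration, the two constructor styles in avro's own API (the helper names here are made up):

import org.apache.avro.Schema
import org.apache.avro.specific.{SpecificDatumReader, SpecificRecord}

// schema derived from the class itself, via reflection on its metadata
def reflectiveReader[T <: SpecificRecord](clazz: Class[T]): SpecificDatumReader[T] =
  new SpecificDatumReader[T](clazz)

// schema passed in explicitly as an argument; nothing reflected from T
def schemaReader[T <: SpecificRecord](writerSchema: Schema): SpecificDatumReader[T] =
  new SpecificDatumReader[T](writerSchema)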
Alternatively, you could try using the avro Generic api, which it looks like Flink also supports. You'd probably want to generate Scala classes with avrohugger's Standard format, and then write a function to convert them into GenericRecords before using them with Flink. You might look into Avro4s to automate that last part, as I believe it uses GenericRecord under the hood, but beware that it also regenerates its own schema from the class definition, which usually matches the schema used to generate the class, but not always.
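A minimal sketch of that conversion step, assuming a hypothetical Order case class like avrohugger's Standard output:

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericRecord, GenericRecordBuilder}

// stand-in for an avrohugger Standard-format class
case class Order(id: String, amount: Double)

val orderSchema: Schema = new Schema.Parser().parse(
  """{"type": "record", "name": "Order", "fields": [
    |  {"name": "id", "type": "string"},
    |  {"name": "amount", "type": "double"}]}""".stripMargin)

def toGenericRecord(o: Order): GenericRecord =
  new GenericRecordBuilder(orderSchema)
    .set("id", o.id)
    .set("amount", o.amount)
    .build()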
logicalType is a property of types, not fields: https://avro.apache.org/docs/1.8.2/spec.html#Logical+Types

Hi, I'm just getting into avro, so newbie question. I'm using the sbt plugin to generate case classes. From what I read in the documentation, I thought this idl would generate a case class with a UUID field, but it's still a String:
@namespace ("com.example.directory.event")
protocol Test {
  record TestRecord {
    @logicalType("uuid") string id;
  }
}
// generates: case class TestRecord(id: String)
What am I missing?
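To illustrate the earlier point that logicalType is a property of types, not fields: in the resulting schema JSON the attribute has to sit inside the type, e.g.

{"name": "id", "type": {"type": "string", "logicalType": "uuid"}}

whereas a field-level annotation leaves the type a plain string, so the generator sees no logical type:

{"name": "id", "type": "string", "logicalType": "uuid"}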
addSbtPlugin("com.julianpeeters" % "sbt-avrohugger" % "2.0.0-RC15")
https://repo.maven.apache.org/maven2/com/julianpeeters/sbt-avrohugger_2.12_1.0/2.0.0-RC15/
Note the 2.12_1.0 (Scala 2.12 / sbt 1.0) cross-version suffix in the path. I just wanted to know if anyone else has faced this issue and how they went about solving it.