`Option[BigDecimal]`
`uuid` yet, while the Java schema parser does. I wonder if the Avro C libraries' parsers support it yet? You may like to troubleshoot by replacing the `uuid` type with the `string` type in your IDL or schema.
`string` type instead of `decimal`.
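For illustration, both substitutions could be sketched in Avro IDL like this (the protocol, record, and field names are made up for the example):

```avdl
protocol TroubleshootExample {
  record Payment {
    // was: uuid id;  -- replace the uuid logical type with a plain string
    string id;
    // was: decimal(20, 8) amount;  -- try string instead of decimal
    string amount;
  }
}
```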
Unions, beyond nullable fields, are not supported.
`union {null, decimal(20, 8)} foo;`
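To illustrate the one union shape that is supported: a nullable union like the one above surfaces as an `Option`-typed field. A minimal sketch of what the generated record behaves like (the record and field names are made up, and the `BigDecimal` mapping assumes the decimal customization is enabled):

```scala
// Sketch of a record generated from: union {null, decimal(20, 8)} foo;
// Nullable unions come out as Option fields (names here are hypothetical).
case class Payment(foo: Option[BigDecimal])

object PaymentDemo {
  // Handle the null branch and the decimal branch as an ordinary Option
  def describe(p: Payment): String =
    p.foo.fold("null")(amount => s"decimal value $amount")
}
```

Here `Payment(None)` models the `null` branch of the union and `Payment(Some(...))` the `decimal` branch.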
`sbt/sbt-avro`? It seems to have just up and vanished from GitHub… https://github.com/sbt/sbt-avro
`sbt-avrohugger` to generate Java and Scala from Avro schemas)
Hello! I was just trying out avrohugger (looking to introduce it generally to the organization), but am running into a rather confusing error when I try to generate a schema with enums. Do you have any advice?
```scala
val schemaStr =
  """
    |{
    |  "type": "record",
    |  "name": "Employee",
    |  "fields": [
    |    {
    |      "name": "Gender",
    |      "type": "enum",
    |      "symbols": [ "male", "female", "non_binary" ]
    |    }
    |  ]
    |}
    |""".stripMargin
val myScalaTypes = Some(Standard.defaultTypes.copy(enum = EnumAsScalaString))
val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
generator.stringToFile(schemaStr)
```
Which throws:
```
ToolBoxError: reflective compilation has failed:

identifier expected but string literal found.
...
```
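For what it's worth, one thing to check: in Avro, `enum` is a named complex type, so a field can't declare a bare `"type": "enum"` with `symbols` inline on the field itself; the field's type must be a nested type object carrying its own `name`. A corrected sketch of the schema above (keeping the same record and symbols):

```scala
val schemaStr =
  """
    |{
    |  "type": "record",
    |  "name": "Employee",
    |  "fields": [
    |    {
    |      "name": "Gender",
    |      "type": {
    |        "type": "enum",
    |        "name": "Gender",
    |        "symbols": [ "male", "female", "non_binary" ]
    |      }
    |    }
    |  ]
    |}
    |""".stripMargin
```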
```scala
import scala.math.BigDecimal.RoundingMode

val myScalaTypes = Some(SpecificRecord.defaultTypes.copy(decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))))
val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
```
Or I'd be curious if you had something else in mind.
`src_managed`. Avrohugger generates into that directory by default, in `target`, intended for source files managed totally by sbt (and thus excluded from publishing as well). Overriding that is probably a question for sbt, but I wonder if it'll work for you if you configure Avrohugger's output to use a directory in `src`, e.g. via `avroSpecificScalaSource`, as in https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/sbt-avrohugger/overridesettings/build.sbt#L12. See https://github.com/julianpeeters/sbt-avrohugger#settings
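For instance, a minimal `build.sbt` sketch of that override (the `avroSpecificScalaSource` setting comes from the sbt-avrohugger settings list; the chosen path is just an example):

```scala
// Redirect sbt-avrohugger's SpecificRecord output from the default
// target/.../src_managed location to a directory under src/.
// The path below is an arbitrary example, not a recommendation.
Compile / avroSpecificScalaSource := baseDirectory.value / "src" / "main" / "compiled_avro"
```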