By the way, are there any plans to support intermediate output of schema JSON files?
Despite my misunderstandings above, given how we plan on using our schemas, it would still be really nice if the one compilation pipeline that is already reading IDL and translating it to JSON could also dump that JSON into the jar’s resources or somewhere similar.
A lot of libraries (e.g. other languages’ Avro support) don’t support IDL, and trying to plug avro-tools into the toolchain when avrohugger is already reading the files seems like an exercise in pain.
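For anyone who needs the JSON in the meantime, avro-tools can emit it from IDL outside the avrohugger pipeline; a rough sketch (the jar version and the file paths here are placeholders, not from this thread):

```shell
# idl2schemata writes one .avsc JSON file per named schema in the IDL.
# Adjust the avro-tools version and paths to match your project.
java -jar avro-tools-1.11.3.jar idl2schemata schemas/events.avdl src/main/resources/avro/
```

That is the extra toolchain step the comment above is hoping to avoid, since avrohugger has already parsed the same files.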
#0 0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0)
at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:842
#1 0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0)
at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:867
#2 0x0000000001690608 in memcpy (str1=0x0, str2=0x7f058c2a88a0, n=37) at /sources/sto/system/lib/a10_dpdk/a10_dpdk_buffer.c:543
#3 0x00007f0cbc9ae99e in avro_raw_string_set (str=0x7efc135524e8, src=0x7f058c2a88a0 "ba174a18-65a8-11e9-a6f2-3516e7b6bcbf")
Option[BigDecimal]
uuid yet, while the Java schema parser does. I wonder if the Avro C libraries' parsers support it yet? You may like to troubleshoot by replacing the uuid type with the string type in your IDL or schema, and by using the string type instead of decimal.
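Concretely, that troubleshooting swap might look like this in IDL (a hypothetical record invented for illustration; the annotation form is one way the uuid logical type appears in IDL):

```
record Session {
  // was: @logicalType("uuid") string id;
  string id;
  // was: decimal(20, 8) amount;
  string amount;
}
```

If the crash goes away with plain strings, that points at the logical-type handling rather than the record layout.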
Unions, beyond nullable fields, are not supported.
union {null, decimal(20, 8)} foo;
sbt/sbt-avro? It seems to have just up and vanished from GitHub… https://github.com/sbt/sbt-avro (using sbt-avrohugger to generate Java and Scala from Avro schemas)
Hello! I was just trying out avrohugger (looking to introduce it to the organization generally), but I'm running into a rather confusing error when I try to generate code from a schema with enums. Do you have any advice?
val schemaStr =
  """
    |{
    |  "type": "record",
    |  "name": "Employee",
    |  "fields": [
    |    {
    |      "name": "Gender",
    |      "type": "enum",
    |      "symbols": [ "male", "female", "non_binary" ]
    |    }
    |  ]
    |}
    |""".stripMargin
val myScalaTypes = Some(Standard.defaultTypes.copy(enum = EnumAsScalaString))
val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
generator.stringToFile(schemaStr)
Which throws:
ToolBoxError: reflective compilation has failed:
identifier expected but string literal found.
...
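For what it's worth, the schema string above isn't valid Avro on its own: per the Avro spec, an enum is a named type, so it must be declared as an object nested under the field's "type" (with its own "name"), not as the bare string "enum". That may or may not be the cause of the ToolBox error, but it's worth ruling out first. A spec-conforming version (a sketch, not tested against avrohugger):

```
{
  "type": "record",
  "name": "Employee",
  "fields": [
    {
      "name": "Gender",
      "type": {
        "type": "enum",
        "name": "Gender",
        "symbols": [ "male", "female", "non_binary" ]
      }
    }
  ]
}
```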
val myScalaTypes = Some(SpecificRecord.defaultTypes.copy(decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))))
val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
Or I'd be curious if you had something else in mind.
src_managed: Avrohugger generates into that directory by default, in target, since it's intended for source files managed entirely by sbt (and thus excluded from publishing as well). Overriding that is probably a question for sbt, but I wonder if it'll work for you if you configure Avrohugger's output to use a directory in src, e.g. https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/sbt-avrohugger/overridesettings/build.sbt#L12