    Josh McDade
    Thanks @julianpeeters, I guess I just assumed IDL and avsc were interchangeable definitions
    Hello @julianpeeters , Firstly, thank you for the awesome project.
    I just want to know if there is some special configuration needed on Nexus proxy to resolve the plugin dependency.
    addSbtPlugin("com.julianpeeters" % "sbt-avrohugger" % "2.0.0-RC15")
    The artifact resolves fine without my Nexus proxy; with the proxy, however, it doesn't resolve.
    Although I can clearly see it here https://repo.maven.apache.org/maven2/com/julianpeeters/sbt-avrohugger_2.12_1.0/2.0.0-RC15/
    the proxy seems to have a problem fetching it, and I suspect it is due to the suffix 2.12_1.0. I just wanted to know if anyone else has faced this issue and how they went about solving it.
    I was able to resolve the issue by setting the nexus proxy layout to permissive
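    For context, sbt publishes plugins under a Maven path that appends the Scala and sbt binary versions to the artifact name, which strict repository layout validation can reject. A sketch of what the single addSbtPlugin line expands to on the repository (suffix shown for sbt 1.x plugins built against Scala 2.12):

    ```scala
    // project/plugins.sbt
    // sbt appends the Scala binary version (2.12) and the sbt binary version (1.0)
    // to the artifact name, so the plugin lives at the non-standard Maven path:
    //   com/julianpeeters/sbt-avrohugger_2.12_1.0/2.0.0-RC15/
    addSbtPlugin("com.julianpeeters" % "sbt-avrohugger" % "2.0.0-RC15")
    ```

    Setting the Nexus proxy repository's layout policy to Permissive, as described above, lets the proxy accept that _2.12_1.0 suffix instead of rejecting it as an invalid Maven coordinate.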
    Julian Peeters
    thanks a ton @pascals-ager . I'll add a note about that to the readme
    Nick Aldwin
    :wave: hello! We’re starting to use sbt-avrohugger. The plugin itself works great to generate scala. However, I’m running into a confusing error.
    If I start with a clean, tests-passing tree of our master branch, without the plugin, and then add sbt-avrohugger to plugins.sbt with no other changes, suddenly the tests in my submodule are unable to resolve any dependencies
    that is, the diff is literally just the addSbtPlugin line and nothing else, and suddenly the tests are not happy
    Any ideas what could be going on here? I’m using 2.0.0-RC16 if that helps
    happy to open a ticket too but figured I could try here first
    1. sbt the-submodule/clean the-submodule/test succeeds
    2. Add addSbtPlugin line to plugins.sbt (there are multiple other plugins already there, working fine)
    3. sbt the-submodule/clean the-submodule/test fails with no dependencies seemingly able to resolve
      (i.e. even scalatest isn’t there, and we get value should is not a member of String for a “foo” should “bar” in {)
    Nick Aldwin
    I did a little more digging, and it seems to work fine in RC15 — so this is due to some change between RC15 and RC16
    Julian Peeters
    @NJAldwin wow, bizarre, thank you very much for the report. This makes me realize that avrohugger indeed doesn't have a test with submodules. I'll see if I can reproduce this.
    from RC15 to RC16, I tried to match Java Avro by adding a way to check the classpath during compilation -- something seems to have gone very wrong!
    I expect I can look more closely tomorrow
    Nick Aldwin
    Thanks, I appreciate your looking into it!
    I’ve a mostly-unrelated (probably noob) question. I wrote a small IDL file, and avrohugger is able to use it to generate a case class. However, if I want to actually use Avro to deserialize, I have to load the schema separately, and Avro cannot load IDLs. Am I missing something here? Or does it mean that if I want to actually use Avro to serialize/deserialize to/from these case classes, I must either write the JSON schema directly or somehow build the avrohugger-tools jar into the build process? I really feel like I’m missing something important that’s making all of this much harder for myself
    Nick Aldwin
    fwiw I think what I was missing above was that I had some incorrect information on generic records vs specific records
    I see now that generating specific records seems to do what I was looking for
    for some reason some of the docs I was reading seemed to suggest that these were differences in the resulting format, rather than just in Avro’s Java API
    Julian Peeters
    Thanks again @NJAldwin , submodules should be working again, now in RC17
    Nick Aldwin
    Thanks @julianpeeters ! Glad it doesn’t seem to have been too hard to track down

    By the way, are there any plans to support intermediate output of schema JSON files?

    Despite my misunderstandings above, given how we plan to use our schemas, it would still be really nice if the one compilation pipeline that is already reading IDL and translating it to JSON could also dump that JSON into the jar’s resources or somewhere similar.

    A lot of libs (e.g. other languages’ Avro support) don’t support IDL, and trying to plug avro-tools into the toolchain when avrohugger is already reading the files seems like an exercise in pain.
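    Until such a feature lands, one stopgap is the avro-tools jar's idl2schemata command, which converts an .avdl file into one plain-JSON .avsc per named type (paths below are illustrative; the jar is the one shipped by the Apache Avro project):

    ```shell
    # Convert an IDL file into one .avsc schema file per named type.
    # Illustrative input/output paths; adjust to your build layout.
    java -jar avro-tools.jar idl2schemata src/main/avro/user.avdl target/schemata/
    # target/schemata/ now holds plain JSON schemas usable by non-Java Avro libraries.
    ```

    Wiring this into the build as a resource-generation step would at least avoid hand-writing the JSON schemas, though it does mean running avro-tools alongside avrohugger.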

    Julian Peeters
    @NJAldwin Plans, yes julianpeeters/sbt-avrohugger#28
    Time-frame, no.
    I'd be happy to maintain it if someone wanted to submit a PR tho
    but currently it's pretty far down on my queue
    anyone using the Avro C libraries here?
    I'm seeing a crash inside the Avro APIs:

    #0  0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0)
        at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:842
    #1  0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0)
        at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:867
    #2  0x0000000001690608 in memcpy (str1=0x0, str2=0x7f058c2a88a0, n=37)
        at /sources/sto/system/lib/a10_dpdk/a10_dpdk_buffer.c:543
    #3  0x00007f0cbc9ae99e in avro_raw_string_set (str=0x7efc135524e8, src=0x7f058c2a88a0 "ba174a18-65a8-11e9-a6f2-3516e7b6bcbf")
    avro_raw_string_set -->
    Nick Aldwin
    speaking of Avro issues
    apparently unions of logical types don’t work with the Java serialization?
    we’ve developed a schema with a union of null and a decimal logical type
    which avrohugger is able to correctly turn into Option[BigDecimal]
    but it seems that, due to this bug, we cannot use Avro files with such a field with Avro’s Java implementation: https://issues.apache.org/jira/browse/AVRO-1891
    has anyone else run into this, and maybe found a workaround?
    Nick Aldwin
    (there are the obvious workarounds of using a second boolean “presence” field, or using some sentinel value, or just using double (with the associated potential loss of precision), but I’m hoping maybe someone has found something I haven’t thought of here?)
    Julian Peeters
    @adityanahan sorry, no experience personally. However, I recall that Avro's Java IDL parser doesn't support uuid yet, while the Java schema parser does. I wonder if the Avro C libraries' parsers support it yet? You might troubleshoot by replacing the uuid type with string in your IDL or schema
    @NJAldwin dang, that's unfortunate. Until that java avro codegen bug is fixed, I don't see a way to share the schema with a java team. I wonder if it may be worth considering losing a little type safety in your schema, and using string type instead of decimal.
    Tibo Delor
    Hi there, wondering whether there's an example of how to have nested types.
    I am trying to generate something like:
    case class Address(street: String)
    case class User(name: String, address1: Address,  address2: Address)
    I have tried to put all my types into one avsc as suggested here, but it fails with "Unions, beyond nullable fields, are not supported".
    Tibo Delor
    Alright, I figured it out: just declare the types in separate files and reference them as "type": "myPackage.Address"
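    A sketch of that layout (file names and the myPackage namespace are illustrative, and this assumes avrohugger can resolve the fully-qualified reference when both files are on its source path). Address.avsc defines the nested record:

    ```json
    {
      "type": "record",
      "name": "Address",
      "namespace": "myPackage",
      "fields": [
        { "name": "street", "type": "string" }
      ]
    }
    ```

    and User.avsc refers to it by fully-qualified name instead of inlining the definition:

    ```json
    {
      "type": "record",
      "name": "User",
      "namespace": "myPackage",
      "fields": [
        { "name": "name", "type": "string" },
        { "name": "address1", "type": "myPackage.Address" },
        { "name": "address2", "type": "myPackage.Address" }
      ]
    }
    ```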
    Julien Truffaut
    Hi all, I am getting an ArithmeticException: Rounding necessary when I try to serialize a BigDecimal
    I used a field with union {null, decimal(20, 8)} foo;
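    A minimal, Avro-free Java sketch of why that exception appears, assuming (as the error message suggests) that the decimal conversion rescales the BigDecimal to the schema's scale without permitting rounding: decimal(20, 8) fixes the scale at 8, so a value with more than 8 decimal places cannot be rescaled exactly.

    ```java
    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class DecimalScaleDemo {
        public static void main(String[] args) {
            // Schema is decimal(20, 8): precision up to 20 digits, scale exactly 8.
            BigDecimal tooPrecise = new BigDecimal("1.123456789"); // scale 9

            try {
                // Rescaling without permitting rounding, as a strict serializer would:
                tooPrecise.setScale(8, RoundingMode.UNNECESSARY);
            } catch (ArithmeticException e) {
                System.out.println("caught: " + e.getMessage()); // "Rounding necessary"
            }

            // Workaround: round explicitly to the schema's scale before serializing.
            BigDecimal rescaled = tooPrecise.setScale(8, RoundingMode.HALF_UP);
            System.out.println(rescaled); // prints 1.12345679
        }
    }
    ```

    If the rounding is applied deliberately at the boundary where the BigDecimal is produced, the serialized value always matches the schema's scale and the exception goes away, at the cost of silently dropping precision beyond 8 decimal places.
    
    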