    Julian Peeters
    @julianpeeters
    @NJAldwin Plans, yes julianpeeters/sbt-avrohugger#28
    Time-frame, no.
    I'd be happy to maintain it if someone wanted to submit a PR tho
    but currently it's pretty far down on my queue
    adityanahan
    @adityanahan
    anyone using avro c libraries here
    seeing a crash inside the avro apis

    #0 0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0) at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:842
    #1 0x0000000001690608 in memcpy (n=37, src=0x7f058c2a88a0, dst=0x0) at /sources/sto/system/external/dpdk-18.05/x86_64-default-linuxapp-gcc/include/rte_memcpy.h:867
    #2 0x0000000001690608 in memcpy (str1=0x0, str2=0x7f058c2a88a0, n=37) at /sources/sto/system/lib/a10_dpdk/a10_dpdk_buffer.c:543
    #3 0x00007f0cbc9ae99e in avro_raw_string_set (str=0x7efc135524e8, src=0x7f058c2a88a0 "ba174a18-65a8-11e9-a6f2-3516e7b6bcbf")
    avro_raw_string_set -->
    Nick Aldwin
    @NJAldwin
    speaking of Avro issues
    apparently unions of logical types don’t work with the Java serialization?
    we’ve developed a schema with a union of null and a decimal logical type
    which avrohugger is able to correctly turn into Option[BigDecimal]
    but it seems that due to this bug, we cannot use avro files with such a field when it's present, with Avro's Java implementation: https://issues.apache.org/jira/browse/AVRO-1891
    has anyone else run into this, and maybe found a workaround?
    Nick Aldwin
    @NJAldwin
    (there are the obvious workarounds of using a second boolean “presence” field, or using some sentinel value, or just using double (with the associated potential loss of precision), but I’m hoping maybe someone has found something I haven’t thought of here?)
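    For reference, a minimal sketch of the kind of field being discussed: a nullable decimal logical type in an .avsc record. The field name, precision, and scale here are hypothetical.

        { "name": "amount",
          "type": ["null", { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 8 }],
          "default": null }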
    Julian Peeters
    @julianpeeters
    @adityanahan sorry, no experience personally. However, I recall that avro's java idl parser doesn't support uuid yet, while the java schema parser does. I wonder if the avro c libraries' parsers support it yet? You may like to troubleshoot by replacing uuid type with string type in your idl or schema
    @NJAldwin dang, that's unfortunate. Until that java avro codegen bug is fixed, I don't see a way to share the schema with a java team. I wonder if it may be worth considering losing a little type safety in your schema, and using string type instead of decimal.
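    A rough sketch of that string fallback, reusing the hypothetical field name from above: the nullable decimal would become a plain nullable string, trading some type safety for compatibility with the Java implementation.

        { "name": "amount", "type": ["null", "string"], "default": null }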
    Tibo Delor
    @t-botz
    Hi there, wondering whether there's an example of how to have nested types.
    I am trying to generate something like:
    case class Address(street: String)
    case class User(name: String, address1: Address,  address2: Address)
    I have tried to put all my types into one avsc as suggested here, but it fails with "Unions, beyond nullable fields, are not supported".
    Tibo Delor
    @t-botz
    Alright, I figured it out: you just have to declare it in separate files and reference the type as "type": "myPackage.Address"
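    A minimal sketch of that layout, assuming the schemas live in a namespace called myPackage and are split across two .avsc files (file names are hypothetical):

        Address.avsc:
        {
          "type": "record",
          "name": "Address",
          "namespace": "myPackage",
          "fields": [ { "name": "street", "type": "string" } ]
        }

        User.avsc:
        {
          "type": "record",
          "name": "User",
          "namespace": "myPackage",
          "fields": [
            { "name": "name", "type": "string" },
            { "name": "address1", "type": "myPackage.Address" },
            { "name": "address2", "type": "myPackage.Address" }
          ]
        }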
    Julien Truffaut
    @julien-truffaut
    Hi all, I am getting an ArithmeticException: Rounding necessary when I try to serialize a BigDecimal
    I used a field with union {null, decimal(20, 8)} foo;
    Julien Truffaut
    @julien-truffaut
    so I solved it by manually scaling my BigDecimal to the precision I used in the schema, e.g. bigDecimal.setScale(8, RoundingMode.HALF_DOWN)
    it would be great if we could specify the RoundingMode
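    A minimal sketch of that workaround in Scala, assuming a schema of decimal(20, 8); the value here is made up:

        import scala.math.BigDecimal.RoundingMode

        val bigDecimal = BigDecimal("123.456789012345")
        // scale the value to match the schema's scale of 8 up front, so
        // serialization doesn't need to round implicitly and throw
        // ArithmeticException: Rounding necessary
        val scaled = bigDecimal.setScale(8, RoundingMode.HALF_DOWN)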
    Julian Peeters
    @julianpeeters
    Thanks for the report @julien-truffaut , I'll take a look
    Nick Aldwin
    @NJAldwin
    This is a long shot, but anyone in here using sbt/sbt-avro? It seems to have just up and vanished from GitHub… https://github.com/sbt/sbt-avro
    (we use it in concert with sbt-avrohugger to generate java and scala from avro schemas)
    Nick Aldwin
    @NJAldwin
    following on from that ^ it turns out the maintainer accidentally deleted it, and it has since been restored (albeit with all of its issues deleted)
    Jeff Kennedy
    @valesken

    Hello! I was just trying out avrohugger (looking to introduce it generally to the organization), but am running into a rather confusing error when I try to generate a schema with enums. Do you have any advice?

    val schemaStr =
        """
          |{
          |    "type": "record",
          |    "name": "Employee",
          |    "fields": [
          |        {
          |            "name": "Gender",
          |            "type": "enum",
          |            "symbols": [ "male", "female", "non_binary" ]
          |        }
          |    ]
          |}
          |""".stripMargin
      val myScalaTypes = Some(Standard.defaultTypes.copy(enum = EnumAsScalaString))
      val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
      generator.stringToFile(schemaStr)

    Which throws:

    ToolBoxError: reflective compilation has failed:
    identifier expected but string literal found.
        ...
    Jeff Kennedy
    @valesken
    Solved it by just creating a separate namespaced Gender enum and setting the field type in the Employee record to "Gender"
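    In the original snippet the enum's name and symbols sit directly on the field, which isn't a valid Avro type declaration. A hedged sketch of the fix described above, with the enum declared as its own named type and referenced by the Employee field (the namespace is hypothetical):

        Gender.avsc:
        {
          "type": "enum",
          "name": "Gender",
          "namespace": "com.example",
          "symbols": [ "male", "female", "non_binary" ]
        }

        Employee.avsc:
        {
          "type": "record",
          "name": "Employee",
          "namespace": "com.example",
          "fields": [ { "name": "Gender", "type": "com.example.Gender" } ]
        }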
    Julian Peeters
    @julianpeeters
    @julien-truffaut if it's still of interest, I was able to reproduce the rounding bug when serializing decimals in SpecificRecords; it's fixed in RC21 by adding a configurable rounding mode, e.g. val myScalaTypes = Some(SpecificRecord.defaultTypes.copy(decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))))
    Julien Truffaut
    @julien-truffaut
    @julianpeeters cheers
    where does this configuration go?
    jmgpeeters
    @jmgpeeters
    I see https://repo1.maven.org/maven2/com/julianpeeters/sbt-avrohugger_2.10_0.13/ only goes to 2.0.0-RC15, whereas there is already a -RC21 in more recent scala/sbt versions. was support dropped?
    I'm trying to investigate why my logicalType: date and timestamp-millis fields aren't represented as LocalDate & Instant anymore between two projects. In my original project (on RC19) it was fine, but now I'm on an older scala/sbt and they come out as "Long".
    jmgpeeters
    @jmgpeeters
    (worked around it for now with an explicit avroScalaSpecificCustomTypes in Compile := { avrohugger.format.SpecificRecord.defaultTypes.copy(timestampMillis = avrohugger.types.JavaTimeInstant)} in build.sbt, which seems to work fine)
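    A slightly fuller sketch of that override, also mapping the date logical type; avrohugger.types.JavaTimeLocalDate is assumed here to be the corresponding date type name:

        avroScalaSpecificCustomTypes in Compile := {
          avrohugger.format.SpecificRecord.defaultTypes.copy(
            date            = avrohugger.types.JavaTimeLocalDate, // assumed identifier
            timestampMillis = avrohugger.types.JavaTimeInstant
          )
        }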
    Julian Peeters
    @julianpeeters
    @julien-truffaut ah, pardon. If you're using the sbt-avrohugger plugin, then here is an example https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/avrohugger/SpecificSerializationTests/build.sbt#L13, or using avrohugger directly it would be something like
    val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes). Or I'd be curious if you had something else in mind.
    @jmgpeeters yes, support was dropped for sbt 0.13 around then, because cross-compiling for both 1.0.0 and 0.13 became infeasible. Glad you found a solution
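    Put together in build.sbt, the rounding-mode override would look roughly like this (a sketch, assuming the sbt-avrohugger plugin and the SpecificRecord format; the imports are assumptions):

        import avrohugger.types.ScalaBigDecimal
        import scala.math.BigDecimal.RoundingMode

        avroScalaSpecificCustomTypes in Compile := {
          avrohugger.format.SpecificRecord.defaultTypes.copy(
            decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))
          )
        }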
    jmgpeeters
    @jmgpeeters
    OK, thanks. really cool project, by the way.
    danebert
    @danebert
    I've googled around and not found anything ... so I'm asking here :) ... I'm trying to produce a sources.jar for the generated code (using sbt-avrohugger and sbt-release) but the <schemaname>_2.12.sources.jar is empty (except for the META-INF/MANIFEST.MF) ... is there a configuration I can add that will fix this?
    Julian Peeters
    @julianpeeters
    @danebert I think the keyword you're looking for is src_managed. Avrohugger generates into that directory by default, in target, intended for source files managed entirely by sbt (and thus excluded from publishing as well). Overriding that is probably a question for sbt, but I wonder if it'll work for you if you configure Avrohugger's output to use a directory in src, e.g. https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/sbt-avrohugger/overridesettings/build.sbt#L12
    danebert
    @danebert
    I'm trying that out ... but I must be doing something wrong. I'm adding this to the build.sbt but it is still putting the source in src_managed: avroScalaSource in Compile := new java.io.File(s"${baseDirectory.value}/src/main/scala")
    Julian Peeters
    @julianpeeters
    @danebert which format are you generating to? Maybe use avroSpecificScalaSource. https://github.com/julianpeeters/sbt-avrohugger#settings
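    For the specific format, that setting would look roughly like this in build.sbt (a sketch based on the snippet above):

        avroSpecificScalaSource in Compile :=
          new java.io.File(s"${baseDirectory.value}/src/main/scala")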
    danebert
    @danebert
    ah ... that could be it ...
    I am using the 'specific'
    danebert
    @danebert
    yes ... that was it. Thanks for your help!
    Julian Peeters
    @julianpeeters
    👍
    jrciii
    @jrciii
    I'm getting errors for "type": ["null", "string"]. What should I use instead?
    The error is "Unions, beyond nullable fields, are not supported"
    But shouldn't this just turn into an Option[String]?
    jrciii
    @jrciii
    Never mind, it was actually because my schemas also have some fields with a union type containing only one member. Not sure why the team did that -_-;
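    For reference, a sketch of the difference with hypothetical field names: a nullable union like the first field maps to Option[String], while a single-member union like the second triggers the "Unions, beyond nullable fields, are not supported" error:

        { "name": "nickname", "type": ["null", "string"] }
        { "name": "legacyId", "type": ["string"] }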