    Nick Aldwin
    @NJAldwin
    which avrohugger is able to correctly turn into Option[BigDecimal]
    but it seems that due to this bug, we cannot use Avro files containing such a field with Avro's Java implementation: https://issues.apache.org/jira/browse/AVRO-1891
    has anyone else run into this, and maybe found a workaround?
    Nick Aldwin
    @NJAldwin
    (there are the obvious workarounds of using a second boolean “presence” field, or using some sentinel value, or just using double (with the associated potential loss of precision), but I’m hoping maybe someone has found something I haven’t thought of here?)
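For context, the kind of field in question would look something like this in an .avsc (a hypothetical sketch; the field name is invented):

```json
{
  "name": "amount",
  "type": [
    "null",
    { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 8 }
  ],
  "default": null
}
```

avrohugger reads this as Option[BigDecimal]; per AVRO-1891, it is the Java implementation's handling of the nullable decimal logical type that breaks.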
    Julian Peeters
    @julianpeeters
    @adityanahan sorry, no experience personally. However, I recall that avro's java idl parser doesn't support uuid yet, while the java schema parser does. I wonder if the avro c libraries' parsers support it yet? You may like to troubleshoot by replacing uuid type with string type in your idl or schema
    @NJAldwin dang, that's unfortunate. Until that java avro codegen bug is fixed, I don't see a way to share the schema with a java team. I wonder if it may be worth considering losing a little type safety in your schema, and using string type instead of decimal.
    Tibo Delor
    @t-botz
    Hi there, wondering whether there's an example of how to have nested types.
    I am trying to generate something like:
    case class Address(street: String)
    case class User(name: String, address1: Address,  address2: Address)
    I have tried to put all my types into one avsc like suggested here but it fails with Unions, beyond nullable fields, are not supported.
    Tibo Delor
    @t-botz
    Aight, I figured it out: you just have to declare the types in separate files and reference them as "type": "myPackage.Address"
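A sketch of the two-file layout described above (file and package names follow the example; adjust to taste). Address.avsc:

```json
{
  "type": "record",
  "name": "Address",
  "namespace": "myPackage",
  "fields": [ { "name": "street", "type": "string" } ]
}
```

User.avsc, referencing the record by its full name:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "myPackage",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "address1", "type": "myPackage.Address" },
    { "name": "address2", "type": "myPackage.Address" }
  ]
}
```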
    Julien Truffaut
    @julien-truffaut
    Hi all, I am getting an ArithmeticException: Rounding necessary when I try to serialize a BigDecimal
    I used a field with union {null, decimal(20, 8)} foo;
    Julien Truffaut
    @julien-truffaut
    so I solved it by manually scaling my BigDecimal to the precision I used in the schema, e.g. bigDecimal.setScale(8, RoundingMode.HALF_DOWN)
    it would be great if we could specify the RoundingMode
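Spelled out, the manual workaround above looks like this (a sketch; the scale of 8 matches the decimal(20, 8) declared in the schema):

```scala
import scala.math.BigDecimal.RoundingMode

// The schema fixes the scale at 8, so normalize the value before
// serializing; otherwise Avro's decimal conversion throws
// "ArithmeticException: Rounding necessary" for values with more
// fractional digits than the schema allows.
val raw    = BigDecimal("12.123456789")
val scaled = raw.setScale(8, RoundingMode.HALF_DOWN)
```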
    Julian Peeters
    @julianpeeters
    Thanks for the report @julien-truffaut , I'll take a look
    Nick Aldwin
    @NJAldwin
    This is a long shot, but anyone in here using sbt/sbt-avro? It seems to have just up and vanished from GitHub… https://github.com/sbt/sbt-avro
    (we use it in concert with sbt-avrohugger to generate java and scala from avro schemas)
    Nick Aldwin
    @NJAldwin
    following on from that ^ turns out the maintainer accidentally deleted it, and it has since been restored (albeit with all of its issues deleted)
    Jeff Kennedy
    @valesken

    Hello! I was just trying out avrohugger (looking to introduce it generally to the organization), but am running into a rather confusing error when I try to generate a schema with enums. Do you have any advice?

    val schemaStr =
        """
          |{
          |    "type": "record",
          |    "name": "Employee",
          |    "fields": [
          |        {
          |            "name": "Gender",
          |            "type": "enum",
          |            "symbols": [ "male", "female", "non_binary" ]
          |        }
          |    ]
          |}
          |""".stripMargin
      val myScalaTypes = Some(Standard.defaultTypes.copy(enum = EnumAsScalaString))
      val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
      generator.stringToFile(schemaStr)

    Which throws:

    ToolBoxError: reflective compilation has failed:
    identifier expected but string literal found.
        ...
    Jeff Kennedy
    @valesken
    Solved it by just creating a separate namespaced Gender enum and setting the field type in the Employee record to "Gender"
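For reference, the underlying problem is the schema rather than avrohugger: in Avro, enum is a named type, so it cannot appear as a bare "type": "enum" on a field. An equivalent inline form of the fix would be (a sketch based on the schema above):

```json
{
  "type": "record",
  "name": "Employee",
  "fields": [
    {
      "name": "Gender",
      "type": {
        "type": "enum",
        "name": "Gender",
        "symbols": [ "male", "female", "non_binary" ]
      }
    }
  ]
}
```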
    Julian Peeters
    @julianpeeters
    @julien-truffaut if it's still of interest, I was able to reproduce the rounding bug when serializing decimals in SpecificRecords, fixed in RC21 by adding configurable rounding mode e.g. val myScalaTypes = Some(SpecificRecord.defaultTypes.copy(decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))))
    Julien Truffaut
    @julien-truffaut
    @julianpeeters cheers
    where does this configuration go?
    jmgpeeters
    @jmgpeeters
    I see https://repo1.maven.org/maven2/com/julianpeeters/sbt-avrohugger_2.10_0.13/ only goes to 2.0.0-RC15, whereas there is already a -RC21 in more recent scala/sbt versions. was support dropped?
    I'm trying to investigate why my logicalType: date and timestamp-millis aren't represented by LocalDate & Instant anymore, between two projects. in my original one it was fine (where I have RC19), but now I'm on an older scala/sbt and they come out as "Long".
    jmgpeeters
    @jmgpeeters
    (worked around it for now with an explicit avroScalaSpecificCustomTypes in Compile := { avrohugger.format.SpecificRecord.defaultTypes.copy(timestampMillis = avrohugger.types.JavaTimeInstant)} in build.sbt, which seems to work fine)
    Julian Peeters
    @julianpeeters
    @julien-truffaut ah pardon. If you're using the sbt-avrohugger plugin, then here is an example https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/avrohugger/SpecificSerializationTests/build.sbt#L13, or using avrohugger directly would be something like
    val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes), or I'd be curious if you had something else in mind
    @jmgpeeters yes, support was dropped for sbt 0.13 around then, because cross-compiling for both 1.0.0 and 0.13 became infeasible. Glad you found a solution
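Putting the two answers above together, the sbt-avrohugger form of the rounding-mode fix would presumably look like this in build.sbt (a sketch combining the snippets above; not verified against a build):

```scala
// build.sbt: configure decimal handling for SpecificRecord generation
avroScalaSpecificCustomTypes in Compile := {
  avrohugger.format.SpecificRecord.defaultTypes.copy(
    decimal = avrohugger.types.ScalaBigDecimal(
      Some(scala.math.BigDecimal.RoundingMode.HALF_EVEN)
    )
  )
}
```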
    jmgpeeters
    @jmgpeeters
    OK, thanks. really cool project, by the way.
    danebert
    @danebert
    I've googled around and not found anything ... so I'm asking here :) ... I'm trying to produce a sources.jar for the generated code (using sbt-avrohugger and sbt-release) but the <schemaname>_2.12.sources.jar is empty (except for the META-INF/MANIFEST.MF) ... is there a configuration I can add that will fix this?
    Julian Peeters
    @julianpeeters
    @danebert I think the keyword you're looking for is src_managed. Avrohugger generates into that directory by default, in target, intended for source files managed totally by sbt (and thus are excluded from publishing as well). Overriding that is probably a question for sbt, but I wonder if it'll work for you if you configure Avrohugger's output to use a directory in src, e.g. https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/sbt-avrohugger/overridesettings/build.sbt#L12
    danebert
    @danebert
    I'm trying that out ... but I must be doing something wrong. I'm adding this to the build sbt but it is still putting the source in src_managed: avroScalaSource in Compile := new java.io.File(s"${baseDirectory.value}/src/main/scala")
    Julian Peeters
    @julianpeeters
    @danebert which format are you generating to? Maybe use avroSpecificScalaSource. https://github.com/julianpeeters/sbt-avrohugger#settings
    danebert
    @danebert
    ah ... that could be it ...
    I am using the 'specific'
    danebert
    @danebert
    yes ... that was it. Thanks for your help!
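For anyone else landing here, the working setting would be along these lines (a sketch; the path is taken from the attempt above):

```scala
// build.sbt: for the SpecificRecord format, override the output directory
// with avroSpecificScalaSource (avroScalaSource only affects the Standard
// format, which is why the earlier attempt had no effect)
avroSpecificScalaSource in Compile :=
  new java.io.File(s"${baseDirectory.value}/src/main/scala")
```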
    Julian Peeters
    @julianpeeters
    👍
    jrciii
    @jrciii
    I'm getting errors for "type": ["null", "string"]. What should I use instead?
    The error is that Unions beyond nullable fields are not supported
    But shouldn't this just turn into an Option[String]?
    jrciii
    @jrciii
    Never mind, it was actually the fact that my schemas also have some fields with a union type with only one field. Not sure why the team did that -_-;
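To illustrate the distinction (field names invented): a two-branch nullable union is supported, while per the report above a single-branch union trips the same error:

```json
{
  "type": "record",
  "name": "Example",
  "fields": [
    { "name": "ok",  "type": ["null", "string"] },
    { "name": "bad", "type": ["string"] }
  ]
}
```

Note that replacing the one-element union with plain "string" changes the wire format (union values are prefixed with a branch index), so existing serialized data would need migration.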
    Jashan Goyal
    @jashangoyal09
    Hi Folks, I want to convert an .avsc file to a case class, for a Kafka Streams serde.
    Thanks
    Julien Truffaut
    @julien-truffaut
    Hi all, by any chance do you have an example of using avro-hugger with kafka producer for a recent version of kafka client? e.g. > 2
    Roger Xu
    @rogerzxu
    Hi everyone, have a quick question I couldn't find the answer to: is there a way to configure the format of the generated sources?
    like adding newlines in between the fields, etc...
    Shivam Sharma
    @shivamsharma
    can we generate Java POJOs also using avrohugger?
    Floris Smit
    @florissmit10
    Hi, does anyone know how to override an "empty" namespace? My avro schemas are given and without a namespace
    Richard Gilmore
    @gilandose
    wondering, for logical types, is it possible to have them generated as plain Int or Long?
    Claudio Scandura
    @claudio-scandura
    Hi, does anybody know if Union types are supported when generating SpecificRecord classes? I get the following error: [error] (Compile / avroScalaGenerateSpecific) Unions beyond nullable fields are not supported, which seems to suggest the answer is no. Any idea?
    zachkirlew
    @zachkirlew

    Hey @julianpeeters avrohugger does not seem to support indirect recursion in schemas despite it being fine in avro. e.g.

    {
      "type": "record",
      "name": "A",
      "fields": [
        {
          "name": "B",
          "type": [
            "null",
            {
              "name": "C",
              "type": "record",
              "fields": [
                {
                  "name": "D",
                  "type": [
                    "null",
                    "A"
                  ]
                }
              ]
            }
          ]
        }
      ]
    }

    Avrohugger is throwing a key not found error.

    I did some investigating into the code and I noticed that the problem is with the order of type registration and compilation. There is a SourceFormat.registerType method call for each “compilation unit”. I imagine the assumption was that, since a topological sort is done first, it would be safe to do registration just before compiling each unit. However, in order to support indirect recursion you would have to register all types first before you can start compiling them.

    Would it be possible to change this in order to support indirect recursion?

    Thomas
    @thomasschoeftner
    hi folks - quick question:
    We need to split Avro models across different sbt projects.
    Is it possible with sbt & avro-hugger to use a model from an .avsc in one sbt sub-project, in an .avsc file in another sbt sub-project?
    With default settings, avro-hugger keeps complaining that the required type from the other project is not found.
    Thx a lot!