    Julien Truffaut
    @julien-truffaut
    so I solved it by manually scaling my BigDecimal to the precision I used in the schema, e.g. bigDecimal.setScale(8, RoundingMode.HALF_DOWN)
    it would be great if we could specify the RoundingMode
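
    A minimal Scala sketch of that workaround (the value and the scale of 8 are illustrative, assuming a schema decimal logical type with scale 8):

    import scala.math.BigDecimal.RoundingMode

    // Scale the value to the schema's decimal scale up front, so serialization
    // does not have to round it implicitly.
    val raw: BigDecimal    = BigDecimal("1234.123456789")
    val scaled: BigDecimal = raw.setScale(8, RoundingMode.HALF_DOWN)
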
    Julian Peeters
    @julianpeeters
    Thanks for the report @julien-truffaut , I'll take a look
    Nick Aldwin
    @NJAldwin
    This is a long shot, but anyone in here using sbt/sbt-avro? It seems to have just up and vanished from GitHub… https://github.com/sbt/sbt-avro
    (we use it in concert with sbt-avrohugger to generate java and scala from avro schemas)
    Nick Aldwin
    @NJAldwin
    following on from that ^ it turns out the maintainer accidentally deleted it, and it has since been restored (albeit with all of its issues deleted)
    Jeff Kennedy
    @valesken

    Hello! I was just trying out avrohugger (looking to introduce it generally to the organization), but am running into a rather confusing error when I try to generate a schema with enums. Do you have any advice?

    val schemaStr =
        """
          |{
          |    "type": "record",
          |    "name": "Employee",
          |    "fields": [
          |        {
          |            "name": "Gender",
          |            "type": "enum",
          |            "symbols": [ "male", "female", "non_binary" ]
          |        }
          |    ]
          |}
          |""".stripMargin
      val myScalaTypes = Some(Standard.defaultTypes.copy(enum = EnumAsScalaString))
      val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes)
      generator.stringToFile(schemaStr)

    Which throws:

    ToolBoxError: reflective compilation has failed:
    identifier expected but string literal found.
        ...
    Jeff Kennedy
    @valesken
    Solved it by just creating a separate namespaced Gender enum and setting the field type in the Employee record to "Gender"
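
    For reference, a sketch of a schema shape that avoids the error by declaring the enum as a named type inline; the actual fix above used a separate namespaced Gender schema referenced by name, so this is just an illustrative variant:

    val schemaStr =
        """
          |{
          |    "type": "record",
          |    "name": "Employee",
          |    "fields": [
          |        {
          |            "name": "Gender",
          |            "type": {
          |                "type": "enum",
          |                "name": "Gender",
          |                "symbols": [ "male", "female", "non_binary" ]
          |            }
          |        }
          |    ]
          |}
          |""".stripMargin
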
    Julian Peeters
    @julianpeeters
    @julien-truffaut if it's still of interest, I was able to reproduce the rounding bug when serializing decimals in SpecificRecords, fixed in RC21 by adding configurable rounding mode e.g. val myScalaTypes = Some(SpecificRecord.defaultTypes.copy(decimal = ScalaBigDecimal(Some(RoundingMode.HALF_EVEN))))
    Julien Truffaut
    @julien-truffaut
    @julianpeeters cheers
    where does this configuration go?
    jmgpeeters
    @jmgpeeters
    I see https://repo1.maven.org/maven2/com/julianpeeters/sbt-avrohugger_2.10_0.13/ only goes to 2.0.0-RC15, whereas there is already a -RC21 for more recent scala/sbt versions. Was support dropped?
    I'm trying to investigate why my logicalType: date and timestamp-millis aren't represented by LocalDate & Instant anymore, between two projects. In my original one it was fine (where I have RC19), but now I'm on an older scala/sbt and they come out as "Long".
    jmgpeeters
    @jmgpeeters
    (worked around it for now with an explicit avroScalaSpecificCustomTypes in Compile := { avrohugger.format.SpecificRecord.defaultTypes.copy(timestampMillis = avrohugger.types.JavaTimeInstant)} in build.sbt, which seems to work fine)
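
    Formatted as a build.sbt snippet, that workaround would look roughly like the following; the date override is an assumption added by analogy and was not part of the original message:

    avroScalaSpecificCustomTypes in Compile := {
      avrohugger.format.SpecificRecord.defaultTypes.copy(
        timestampMillis = avrohugger.types.JavaTimeInstant,
        date = avrohugger.types.JavaTimeLocalDate // assumed name for the date override
      )
    }
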
    Julian Peeters
    @julianpeeters
    @julien-truffaut ah pardon. If you're using the sbt-avrohugger plugin, then here is an example https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/avrohugger/SpecificSerializationTests/build.sbt#L13, or using avrohugger directly would be something like
    val generator = Generator(Standard, avroScalaCustomTypes = myScalaTypes), or I'd be curious if you had something else in mind
    @jmgpeeters yes, support was dropped for sbt 0.13 around then, because cross-compiling for both 1.0.0 and 0.13 became infeasible. Glad you found a solution
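
    Putting those pieces together for the rounding-mode case, a hedged build.sbt sketch (assuming the sbt-avrohugger SpecificRecord settings quoted in this thread):

    import scala.math.BigDecimal.RoundingMode

    avroScalaSpecificCustomTypes in Compile := {
      avrohugger.format.SpecificRecord.defaultTypes.copy(
        decimal = avrohugger.types.ScalaBigDecimal(Some(RoundingMode.HALF_EVEN)))
    }
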
    jmgpeeters
    @jmgpeeters
    OK, thanks. really cool project, by the way.
    danebert
    @danebert
    I've googled around and not found anything ... so I'm asking here :) ... I'm trying to produce a sources.jar for the generated code (using sbt-avrohugger and sbt-release) but the <schemaname>_2.12.sources.jar is empty (except for the META-INF/MANIFEST.MF) ... is there a configuration I can add that will fix this?
    Julian Peeters
    @julianpeeters
    @danebert I think the keyword you're looking for is src_managed. Avrohugger generates into that directory (under target) by default; it's intended for source files managed entirely by sbt, and thus excluded from publishing as well. Overriding that is probably a question for sbt, but I wonder if it'll work for you if you configure Avrohugger's output to use a directory in src, e.g. https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/sbt-avrohugger/overridesettings/build.sbt#L12
    danebert
    @danebert
    I'm trying that out ... but I must be doing something wrong. I'm adding this to the build.sbt but it is still putting the source in src_managed: avroScalaSource in Compile := new java.io.File(s"${baseDirectory.value}/src/main/scala")
    Julian Peeters
    @julianpeeters
    @danebert which format are you generating to? Maybe use avroSpecificScalaSource. https://github.com/julianpeeters/sbt-avrohugger#settings
    danebert
    @danebert
    ah ... that could be it ...
    I am using the 'specific'
    danebert
    @danebert
    yes ... that was it. Thanks for your help!
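
    For reference, the working setting presumably ended up looking roughly like this (path illustrative, assuming the SpecificRecord format per the exchange above):

    avroSpecificScalaSource in Compile :=
      new java.io.File(s"${baseDirectory.value}/src/main/scala")
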
    Julian Peeters
    @julianpeeters
    👍
    jrciii
    @jrciii
    I'm getting errors for "type": ["null", "string"]. What should I use instead?
    The error is that Unions beyond nullable fields are not supported.
    But shouldn't this just turn into an Option[String]?
    jrciii
    @jrciii
    Never mind, it was actually that my schemas also have some fields whose union type contains only a single member. Not sure why the team did that -_-;
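
    For illustration (field names invented): a plain nullable union maps to an Option, while a single-member union is not the simple ["null", X] shape expected here, so it trips the same "Unions beyond nullable fields" error:

    { "name": "ok",          "type": ["null", "string"] }
    { "name": "problematic", "type": ["string"] }
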
    Jashan Goyal
    @jashangoyal09
    Hi folks, I want to convert an .avsc file to a case class, for a Kafka Streams serde.
    Thanks
    Julien Truffaut
    @julien-truffaut
    Hi all, by any chance do you have an example of using avro-hugger with a Kafka producer for a recent version of the Kafka client, e.g. > 2?
    Roger Xu
    @rogerzxu
    Hi everyone, have a quick question I couldn't find the answer to: is there a way to configure the format of the generated sources?
    like adding newlines in between the fields, etc...
    Shivam Sharma
    @shivamsharma
    Can we also generate Java POJOs using avrohugger?
    Floris Smit
    @florissmit10
    Hi, does anyone know how to override an "empty" namespace? My Avro schemas are given to us and have no namespace.
    Richard Gilmore
    @gilandose
    Wondering, for logical types: is it possible to have them generated as plain Int or Long?
    Claudio Scandura
    @claudio-scandura
    Hi, does anybody know if Union types are supported when generating SpecificRecord classes? I get the following error: [error] (Compile / avroScalaGenerateSpecific) Unions beyond nullable fields are not supported, which seems to suggest the answer is no. Any idea?
    zachkirlew
    @zachkirlew

    Hey @julianpeeters avrohugger does not seem to support indirect recursion in schemas, despite it being fine in Avro, e.g.

    {
      "type": "record",
      "name": "A",
      "fields": [
        {
          "name": "B",
          "type": [
            "null",
            {
              "name": "C",
              "type": "record",
              "fields": [
                {
                  "name": "D",
                  "type": [
                    "null",
                    "A"
                  ]
                }
              ]
            }
          ]
        }
      ]
    }

    Avrohugger is throwing a key not found error.

    I did some investigating in the code and noticed that the problem is the order of type registration and compilation. There is a SourceFormat.registerType method call for each “compilation unit”. I imagine the assumption was that, since a topological sort is done first, it would be safe to do registration just before compiling each unit. However, to support indirect recursion you would have to register all types first, before you start compiling any of them.

    Would it be possible to change this in order to support indirect recursion?

    Thomas
    @thomasschoeftner
    hi folks - quick question:
    We need to split Avro models across different sbt projects.
    Is it possible, with sbt & avro-hugger, to reference a model defined in an .avsc in one sbt sub-project from an .avsc file in another sbt sub-project?
    With default settings, avro-hugger keeps complaining that the required type from the other project is not found.
    Thx a lot!
    Alexander Khotyanov
    @aksharp
    In case this was already discussed: is there a code example of what an Avro schema (avsc file contents) should look like to generate a Scala ADT with a sealed trait when running avroScalaGenerateSpecific?
    Anshul Bansal
    @anshulbansal2
    Hi Folks ,
    I need to add an sbt plugin that brings in avrohugger core and the other dependencies built for Scala 2.11 instead of 2.12; currently addSbtPlugin("com.julianpeeters" % "sbt-avrohugger" % "2.0.0-RC22") is downloading the 2.12 versions of the libs
    Thomas
    @thomasschoeftner
    sry, I think this channel is dead... I haven't seen a response in months :(
    nicu marasoiu
    @nmarasoiu
    Hi guys, I am wondering how it would be possible, using IDL, avsc or other format, to reference type definitions from other types & make sure they are recognized... thanks a lot.. https://stackoverflow.com/questions/62535519/how-do-i-use-an-avro-type-in-another-avro-type-without-defining-it-again
    nicolaemarasoiu
    @nicolaemarasoiu
    Hi, we have a Kafka topic whose values are an Avro union of two Avro types (any message is of one type or the other). We know how to work with union types when they appear in fields in the AvroHugger context, but when the union is at the top level, is there a way to define an ad hoc type that is simply the union of the two explicit types? Thank you
    PatrickEifler
    @PatrickEifler
    Hi, is there a way of generating case classes that are not final using SpecificRecord? Would like to use them to extend other case classes.
    John Halbert
    @johnhalbert
    Hi all! I'm using avrohugger-maven-plugin, and for some reason fields that should be type uuid are typed as String in generated scala code. Does anyone know how to resolve this so it's properly java.util.UUID?
    Yuming Wang
    @wangyum
    Hi, it seems the protocol-generated code is not compatible with Avro 1.8.2. Do we have a way to work around this issue?
    This is because Spark only supports Avro 1.8.2.
    beetlecrunch
    @beetlecrunch

    Hi, I am trying to generate an ADT from this schema using ScalaADT

    protocol Foo {
        record Bar {
            string a;
        }
    
        record Baz {
            string b;
        }
    }

    but what I get is

    /** MACHINE-GENERATED FROM AVRO SCHEMA. DO NOT EDIT DIRECTLY */
    final case class Bar(a: String)
    
    final case class Baz(b: String)

    I would have expected this

    /** MACHINE-GENERATED FROM AVRO SCHEMA. DO NOT EDIT DIRECTLY */
    sealed trait Foo
    
    final case class Bar(a: String) extends Foo
    
    final case class Baz(b: String) extends Foo

    Inside the build.sbt file I have inserted

    avroScalaSpecificCustomTypes in Compile := {
      avrohugger.format.Standard.defaultTypes.copy(
        protocol = avrohugger.types.ScalaADT)
    }
    
    sourceGenerators in Compile += (avroScalaGenerate in Compile).taskValue

    Did I omit something?
    Is the behavior I've described actually what one should expect?
    I apologize if it is a known problem or just a misunderstanding on my part; I hope I'm not wasting your time unnecessarily.

    Thank you
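
    One thing worth checking (an assumption, not a confirmed answer): the snippet above pairs the avroScalaSpecificCustomTypes key with the Standard format's defaults. For the standard avroScalaGenerate task, the matching key would presumably be avroScalaCustomTypes:

    avroScalaCustomTypes in Compile := {
      avrohugger.format.Standard.defaultTypes.copy(
        protocol = avrohugger.types.ScalaADT)
    }
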

    Dmitriy Zakomirnyi
    @dmi3zkm
    Hey!
    I'm considering trying the lib in my project, and I'm wondering why there has been no stable release since Dec 2017.