    Simon Parten
    @Quafadas
    @jeanjerome I had this the other day; the library can do it, but my conclusion was that it sometimes gets its error messaging wrong. Have you tried adding types to the implicit RW and making sure they are in companion objects?
    Jean-Jerome
    @jeanjerome
    @Quafadas Thanks for your response. I finally tried traversing JSON as explained in https://www.lihaoyi.com/post/HowtoworkwithJSONinScala.html#traversing-json
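    For readers following along, a minimal sketch of that traversal style with ujson (the JSON here is invented for illustration):

```scala
import ujson._

val data = ujson.read("""{"users": [{"name": "ann"}, {"name": "bob"}]}""")

// Keyed apply() drills into an object, indexed apply() into an array;
// .arr / .str unwrap the typed values
val names = data("users").arr.map(u => u("name").str)
```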
    Siddhant Sanyam
    @siddhant3s
    case class Foo(bar: String) { 
      val baz = "inner"
      def dbaz = "definner"
    }
    Is there any way to get baz and dbaz serialized along with Foo(bar)?
    Li Haoyi
    @lihaoyi
    you'd have to write a custom serializer using upickle.default.readwriter[ujson.Value].bimap[Foo](..., ...)
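    A sketch of what that could look like for the Foo above, assuming the extra fields should be written out on serialization but ignored when reading back:

```scala
import upickle.default._

case class Foo(bar: String) {
  val baz = "inner"
  def dbaz = "definner"
}

object Foo {
  // Write the derived members alongside bar; on read, only bar
  // is needed to reconstruct the instance
  implicit val rw: ReadWriter[Foo] =
    readwriter[ujson.Value].bimap[Foo](
      foo => ujson.Obj("bar" -> foo.bar, "baz" -> foo.baz, "dbaz" -> foo.dbaz),
      json => Foo(json("bar").str)
    )
}
```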
    Siddhant Sanyam
    @siddhant3s
    Okay. Thanks I'll give it a try.
    MrMuppet-cut
    @MrMuppet-cut
    Hi! I am trying to read a json string using ujson:
    This is the string:
    MrMuppet-cut
    @MrMuppet-cut
    {
      "dataflows": [
        {
          "name": "test",
          "sources": [
           {
              "name": "person_inputs",
              "path": "/data/input/events/person/*",
              "format": "JSON"
            }
          ],
          "transformations": [
            {
              "name": "validation",
              "type": "validate_fields",
              "params": {
                "input": "person_inputs",
                "validations": [
                  {
                    "field": "office",
                    "validations": [
                     "notEmpty"
                    ]
                  },
                  {
                    "field": "age",
                    "validations": [
                      "notNull"
                    ]
                  }
                ]
              }
            },
            {
              "name": "ok_with_date",
              "type": "add_fields",
              "params": {
                "input": "validation_ok",
                "addFields": [
                  {
                    "name": "dt",
              "function": "current_timestamp"
            }
                ]
              }
            }
          ],
          "sinks": [
            {
              "input": "ok_with_date",
              "name": "raw-ok",
              "paths": [
                "/data/output/events/person"
              ],
              "format": "JSON",
              "saveMode": "OVERWRITE"
            },
            {
              "input": "validation_ko",
              "name": "raw-ko",
              "paths": [
                "/data/output/discards/person"
              ],
              "format": "JSON",
              "saveMode": "OVERWRITE"
            }
          ]
        }
      ]
    }
    which is in a file called "metadata.json"
    and I read just by doing:
    val j = os.read(os.pwd / os.RelPath("src/main/scala/metadata.json"))
    val jsonData = ujson.read(j)
    however, when I try
    jsonData(0)
    I get
    ujson.Value$InvalidData: Expected ujson.Arr (data: {"dataflows":[{"name":"test","sources":[{"name":"person_inputs","path":"/data/input/events/person/*","format":"JSON"}],"transformations":[{"name":"validation","type":"validate_fields","params":{"input":"person_inputs","validations":[{"field":"office","validations":["notEmpty"]},{"field":"age","validations":["notNull"]}]}},{"name":"ok_with_date","type":"add_fields","params":{"input":"validation_ok","addFields":[{"name":"dt","function":"current_timestamp"}]}}],"sinks":[{"input":"ok_with_date","name":"raw-ok","paths":["/data/output/events/person"],"format":"JSON","saveMode":"OVERWRITE"},{"input":"validation_ko","name":"raw-ko","paths":["/data/output/discards/person"],"format":"JSON","saveMode":"OVERWRITE"}]}]})
    MrMuppet-cut
    @MrMuppet-cut
    Any idea why this happens?
    Any help would be greatly appreciated. Thanks!!
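    For context on the error: ujson keeps objects and arrays distinct, and the top-level value here is a ujson.Obj, so integer indexing throws InvalidData. A minimal sketch of indexing by key first (using a trimmed-down stand-in for the document above):

```scala
val jsonData = ujson.read("""{"dataflows": [{"name": "test"}]}""")

// The top-level value is an object: index it by key first,
// then by position inside the "dataflows" array
val firstFlow = jsonData("dataflows")(0)
val name = firstFlow("name").str
```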
    Cheng Wang
    @polarker
    Hi, is there a way to fail on excess JSON fields? This feature is useful when working with optional fields.
    Doug Roper
    @htmldoug
    Not easily. The most straightforward way probably requires modifying the macro.
    Next best would be to either convert to Value and count the fields or write a Visitor.Delegate to wrap the Reader and count them that way.
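    A sketch of the "convert to Value and count the fields" route (strictRead and Conf are made-up names for illustration, not upickle API):

```scala
import upickle.default._

case class Conf(a: Int, b: Option[String] = None)
object Conf {
  implicit val rw: ReadWriter[Conf] = macroRW
}

// Hypothetical helper: parse to a ujson.Value first, reject any
// keys outside the expected set, then hand off to the Reader
def strictRead[T: Reader](s: String, allowed: Set[String]): T = {
  val v = ujson.read(s)
  val extra = v.obj.keySet.toSet -- allowed
  require(extra.isEmpty, s"unexpected fields: ${extra.mkString(", ")}")
  read[T](v)
}
```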
    Igor Romanov
    @pharod

    I have in my case class:

    @key("stage_name") stageName: Option[String] = None,

    and when this property is missing from JSON I am getting an error message:

    Caused by: upickle.core.AbortException: missing keys in dictionary: stage_name, state_metadata at index ...

    What is wrong?

    Igor Romanov
    @pharod
    @lihaoyi It seems it works OK only when I'm not using the @key annotation for mapping. Looks like a bug in uPickle, or a limitation?
    Li Haoyi
    @lihaoyi
    huh seems like a bug
    can you minimize a repro and open a ticket
    Igor Romanov
    @pharod
    @lihaoyi I've filed issue com-lihaoyi/upickle#360
    objektwerks
    @objektwerks
    Using Scala 2.13.6 and uPickle 1.4.0, I assume everyone is familiar with this macroRW error:
    match may not be exhaustive. It would fail on the following input: (x: Int forSome x not in 0)
    Currently I use an @nowarn annotation to ignore all such errors. Is there an expected fix to this in a forthcoming version of Scala and/or uPickle? Or is this with us for a long time to come?
    Doug Roper
    @htmldoug
    Should be fixed in https://github.com/com-lihaoyi/upickle/compare/1.4.0...master. Just needs a release.
    objektwerks
    @objektwerks
    @htmldoug Thanks for the update, Doug!
    Lorenzo Gabriele
    @lolgab

    Hi! :wave:
    Does anyone want to review this PR I opened to fix the CI (it wasn't running the tests): com-lihaoyi/upickle#359 ?

    Thanks!

    Then we can trigger a release @htmldoug @objektwerks

    objektwerks
    @objektwerks
    @lolgab Thanks for the update, Lorenzo. Looking forward to the new release. :)
    Lorenzo Gabriele
    @lolgab
    @objektwerks @htmldoug UPickle 1.4.1 is now going to Maven Central: https://github.com/com-lihaoyi/upickle/runs/3514651632
    objektwerks
    @objektwerks
    @lolgab Awesome!
    objektwerks
    @objektwerks
    Just upgraded to upickle 1.4.1 - and no more @nowarn annotation required! Thanks, team uPickle!
    Lorenzo Gabriele
    @lolgab
    Thank you to @htmldoug who made the fix! :)
    Happy to have fixed the CI, so now we can finally merge PRs confidently!
    Siddhant Sanyam
    @siddhant3s
    I'm trying to use upickle to extract my case class's schema. The Visitor that uPickle provides is great. I would like to have a visitor on the type, if you know what I mean:
    1. I have user defined case class Foo(..)
    2. I want a JSON deserializer for that class. That's easy, I can just use macroRW
    3. I also want to codegen Foo into a Python class. For that, it would be nice if I could re-use any of uPickle's machinery to have a Visitor on the schema of Foo. That way I'll be able to generate the Python code string easily
    Li Haoyi
    @lihaoyi
    I don't think the visitor is helpful for codegen. Visitor is called on the runtime data structure, not on the definition structure. Only macroRW has that; you could try forking it to generate the code directly, or to generate metadata you can use for the codegen
    Siddhant Sanyam
    @siddhant3s
    Do you think it's possible to use the Reader[T] generated by macroRW[T] to do something like that? I mean, I can write my own macro, but I was hoping I could re-use something from your macro so that I can be assured that 2. and 3. are in sync.
    Li Haoyi
    @lihaoyi
    no, the macro generated read/writers explicitly avoid generating inspectable data structures to avoid performance overhead. It makes them fast, but it means you cannot really repurpose them for anything else
    i could imagine coming up with some way to store a lazily instantiated type description on each readwriter, so others can use it, but that's not what happens now
    Siddhant Sanyam
    @siddhant3s
    Thanks for confirming. I'll handwrite it in that case!
    Siddhant Sanyam
    @siddhant3s

    @lihaoyi I've been going through your wonderful tutorial on the Visitor pattern yet again. I want to modify it slightly to include a cache-based early stop for recursive data structures. Think of an ADT which is self-referencing. With the pattern in the tutorial, there is no way for the visitor to communicate back to the dispatcher to stop going further.
    So I came up with this approach, could you please comment on what you think about it:

        trait Visitor[T] {
            def visitPrimitive(tpe: Type): T
            def visitOption(innerVisited: T): T
            def visitArrayType(innerVisited: T): T
            def visitMapType(keyVisited: T, valueVisited: T): T
            def visitStructType(tpe: Type): StructVisitor[T]
            def stopValue: Option[T] = None // If defined, we stop visiting and simply return this value
        }
        trait StructVisitor[T] {
            def visitFieldKey(name: String): Visitor[T]
            def visitFieldType(name: String, visitedValue: T): Unit
            def done: T
            def stopValue: Option[T] = None  // If defined, we stop visiting and simply return this value
        }

    This is a visitor to traverse Type from scala.reflect

    I'm not 100% happy with it, so I was wondering if you could share some thoughts on how to improve it.
    Li Haoyi
    @lihaoyi
    I don't have any real insight here, depends on how well it fits your use case
    I always find visitors pretty finicky to think about, so I always end up with a lot of "shake it around until it compiles, shake it some more until tests pass" kind of workflows
    Yeitijem
    @Yeitijem
    Hi, is there some documentation about using uPickle in conjunction with Scala 3? And if yes, where can I find it?