Linh Nguyen
@tuleism
Thanks!
Peter Zhong
@jiujiu1123
Hi, has anyone seen errors like this?
[error] Unspecified value parameter message.
[error]             __inner = Option(__inner.fold(_root_.scalapb.LiteParser.readMessage[livongo.microservice.util.filtering.Testing.FilterTarget.Nested](_input__))(_root_.scalapb.LiteParser.readMessage(_input__, _)))
[error]                                                                                                                                                 ^
[error] /Users/peter.zhong/Desktop/platform-libraries/microservice-util/target/scala-2.13/src_managed/test/scalapb/livongo/microservice/util/filtering/Testing/FilterTarget.scala:433:127: not enough arguments for method readMessage: (input: com.google.protobuf.CodedInputStream, message: livongo.microservice.util.filtering.Testing.FilterTarget.Nested)(implicit cmp: scalapb.GeneratedMessageCompanion[livongo.microservice.util.filtering.Testing.FilterTarget.Nested]): livongo.microservice.util.filtering.Testing.FilterTarget.Nested.
[error] Unspecified value parameter message.
[error]             __values += _root_.scalapb.LiteParser.readMessage[livongo.microservice.util.filtering.Testing.FilterTarget.Nested](_input__)
[error]
Nadav Samet
@thesamet
@jiujiu1123 I haven't seen, but can you check you are generating code using the same version of ScalaPB that you use to compile? (compilerplugin and scalapb-runtime versions need to match)
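As a sketch, one common way to keep the two in lock-step in an sbt build is to derive the runtime version from the compilerplugin itself (the version numbers below are illustrative, not taken from Peter's build):

```scala
// project/plugins.sbt -- generator side (version numbers illustrative)
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.6")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.8"

// build.sbt -- runtime side: scalapb.compiler.Version.scalapbVersion resolves
// to the compilerplugin's own version, so generator and runtime cannot drift
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
```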
Peter Zhong
@jiujiu1123
Thanks let me verify that
Peter Zhong
@jiujiu1123
image.png
@thesamet could it be because the protoc-gen is a different minor version?
Nadav Samet
@thesamet
no, that's fine.
Looks like both compilerplugin and scalapb-runtime are on 0.10.x
Peter Zhong
@jiujiu1123
Yeah which is confusing since the versions should be matching.
Nadav Samet
@thesamet
I am not sure what the list in the screenshot represents; the compilerplugin is not a runtime dependency.
Try to figure out why your patch versions are different.
Another possibility is that, for some reason, you still have source code that was generated by a different version of the compilerplugin than the one configured right now. Try performing an sbt clean, and have it regenerate.
Peter Zhong
@jiujiu1123
Let me try that. Really appreciate all the advice.
Peter Zhong
@jiujiu1123
I got it to work. Thank you so much!
Nadav Samet
@thesamet
:thumbsup:
Alexis BRENON
@AlexisBRENON
Hi. I'm trying to use scalapb-spark to read a dataset from JSON. There seems to be a problem when a "repeated" field does not appear in the JSON stream...
In the "messageReads" method, the __fieldsMap.get(...) call returns Some(null)... Is this a known behavior?
Alexis BRENON
@AlexisBRENON
For example, from this proto file:
syntax = "proto3";

package com.adyoulike.data.proto;

message MyRepeated {
  repeated string ListField = 1;
}
I can load data from this jsonl file:
{"ListField": ["foo", "bar"]}
{"ListField": []}

But this one

{"ListField": ["foo", "bar"]}
{"ListField": []}
{}

fails with

Exception in thread "main" java.lang.NullPointerException
    at com.adyoulike.data.proto.repeated.MyRepeated$.$anonfun$messageReads$4(MyRepeated.scala:84)
    at scala.Option.map(Option.scala:230)
    at com.adyoulike.data.proto.repeated.MyRepeated$.$anonfun$messageReads$1(MyRepeated.scala:84)

(that is to say, the __fieldsMap.get(...) line)

I use an old version of Spark and ScalaPB, but I cannot upgrade for the moment...
I'll try to work around it using a custom type for my Seq: Option[Seq[...]]
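The Some(null) behavior Alexis describes can be reproduced with plain Scala collections. This sketch (a hypothetical map standing in for the generated __fieldsMap, not ScalaPB's actual code) shows why a .map over the result blows up, and how guarding with Option(...) avoids it:

```scala
// Stand-in for the generated __fieldsMap: Spark fills a missing JSON column
// with null, so the key exists but its value is null.
val fieldsMap: Map[String, Seq[String]] = Map("ListField" -> null)

// Map#get wraps whatever is stored -- including null -- so we get Some(null):
val raw: Option[Seq[String]] = fieldsMap.get("ListField")
println(raw)   // prints "Some(null)"

// raw.map(_.length) would dereference the null and throw NullPointerException.

// flatMap through Option(...) normalizes null to None, making the read safe:
val safe: Option[Seq[String]] = raw.flatMap(v => Option(v))
println(safe)  // prints "None"
```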
1 reply
Alexis BRENON
@AlexisBRENON
Another question related to scalapb-spark: is it possible to mix Encoders? I mean, I have a proto message Message, and I define a case class case class Nesting(msg: Message, foo: Int). How can I provide an encoder for this case class?
Nadav Samet
@thesamet
@AlexisBRENON can you give a code snippet showing how you read the json and turn it into a case class? I couldn't tell if scalapb-json4s is involved.
1 reply
@AlexisBRENON Recent versions of sparksql-scalapb use frameless to derive encoders. See the tests for OuterCaseClass and OuterCaseClassTimestamp to see an example. For the second case class it uses frameless injections to customize the timestamp type: https://github.com/scalapb/sparksql-scalapb/blob/4017c0c2dea0e261e731547fbdee1202a2e61b16/sparksql-scalapb/src/test/scala/PersonSpec.scala#L34-L35
1 reply
Ángel Cervera Claudio
@angelcervera
Hi @thesamet I need to release a new version of osm4scala. At the moment, I'm using com.thesamet:sbt-protoc:0.99.34 and com.thesamet.scalapb:compilerplugin:0.10.2
Do you think it makes sense to upgrade to the latest versions? I'm looking for a changelog, but it looks like it is not available, right?
Are the new versions adding better performance and bug fixes, or only new features (that I'm not using)?
Nadav Samet
@thesamet
Hi @angelcervera , the CHANGELOG is in https://github.com/scalapb/ScalaPB/blob/master/CHANGELOG.md . v0.11.x contains additional features, bug fixes, performance improvements, and Scala 3 support.
Ángel Cervera Claudio
@angelcervera
Great
Nadav Samet
@thesamet
@AlexisBRENON , I had a chance to look at it today. It looks like when columns are missing in the json input, spark puts in nulls for the missing values. ScalaPB expects empty arrays, not nulls, when parsing a dataframe. I think it would be useful to have ScalaPB accept nulls when parsing spark dataframes. You could modify your input sources, or prepare them for consumption by ScalaPB by replacing the nulls. If supporting nulls is something that would be useful for your project, feel free to file an issue with sparksql-scalapb.
5 replies
Nadav Samet
@thesamet
@AlexisBRENON A different approach to try would be to read the input source as String, then use ProtoSQL.udf to use scalapb-json4s to parse the json string into a message case class.
Josip Grgurica
@jkobejs
Hi, the scalapb-json4s parser accepts both the json name and the original field name when deserializing json to a proto message.
scalapb-circe doesn't support that: with the preservingProtoFieldNames flag you can pick whether to parse using the original field name or the json name, but you cannot do both.
I think it makes sense to have the same behavior in scalapb-circe as in scalapb-json4s; it makes it easier to support some legacy JSON APIs where you need to accept original names together with new ones.
Let me know if you're ok with having it in the scalapb-circe repo and I'll add support for it :)
Nadav Samet
@thesamet
Hi @jkobejs , scalapb-circe is maintained by @xuwei-k - I anticipate that a PR for it to catch up with scalapb-json4s behavior would be accepted. I suggest filing an issue on that repo.
Josip Grgurica
@jkobejs
Ok, tnx 🙂
Alexis BRENON
@AlexisBRENON

Hi. When defining an enum like

enum Foo {
  FOO_UNSET = 0;
  FOO_BAR = 1;
}

and compiling it with the option enum_strip_prefix, the resulting case objects are UNSET and BAR, but their names are still FOO_UNSET and FOO_BAR.
Is this the expected behavior? Is there a way to force the names to match the case objects?
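For reference, enum_strip_prefix is typically set as a file-level ScalaPB option; a sketch of how the setup above might look (the option placement is illustrative, not taken from Alexis's build):

```protobuf
syntax = "proto3";

import "scalapb/scalapb.proto";

// File-level ScalaPB option: strip the shared FOO_ prefix from value names.
option (scalapb.options) = {
  enum_strip_prefix: true
};

enum Foo {
  FOO_UNSET = 0;  // generated Scala case object: UNSET
  FOO_BAR = 1;    // generated Scala case object: BAR
}
```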

Nadav Samet
@thesamet
Alexis, enumValue.name returns the proto name of the enum. If you want the Scala name, use enumValue.scalaValueDescriptor.scalaName.
The scalaName may or may not be stripped based on the option enum_strip_prefix; it may also have other modifications that avoid naming conflicts with other Scala symbols and make sure it compiles.
Alexis BRENON
@AlexisBRENON
Thanks for the answer. But what I want is for the option enum_strip_prefix to apply to both enumValue.scalaValueDescriptor.scalaName AND enumValue.name.
Nadav Samet
@thesamet
Applying it to enumValue.name would break the API contract (returning the proto name) and could break other libraries. Can you share more about the use case and why accessing the name through the descriptor is not sufficient?
1 reply
Petar Karadzhov
@karadzhov

Hello guys, I've been trying to make Scalapb (scala + validation) work together with buf in this example repo here.

I am running MacOS and the latest versions of the plugins (protoc-gen-scala and protoc-gen-scalapb-validate) are on my PATH.

Everything works fine until I want the validation to happen during construction by setting the extension options here, which results in a compilation failure like this one here.

Any ideas how to fix that would be much appreciated.

Nadav Samet
@thesamet
Hi @karadzhov , thanks for reporting and putting an example repo. Can you verify if the issue you found is related to buf or not? Would the same protos generate correctly through an sbt build?
Petar Karadzhov
@karadzhov

@thesamet I've added the validate/validate.proto manually to the repository, and using plain protoc without buf gives the same result.

It's also possible that I am not doing it right; however, without the validation options it generates correct output.

Here is the exact command and its output that I used

Nadav Samet
@thesamet
@karadzhov Perfect. I'll take a look at it this evening. Which versions of scalapb, scalapb-validate and protoc are involved?
Petar Karadzhov
@karadzhov

@thesamet thank you very much!

Protoc: libprotoc 3.17.3
Scalapb: protoc-gen-scala-0.11.1-osx-x86_64
Scalapb validate: protoc-gen-scalapb-validate-0.3.2-unix

Nadav Samet
@thesamet

@karadzhov , I was able to reproduce the issue. You are using the native scalapb plugin which is built with Scala Native. The Java code that runs to parse the input protos relies on Java reflection. With Scala Native, all the reflective calls need to be known at compile time. To accomplish this in ScalaPB we rely on our test coverage to exercise all the code paths that could possibly trigger reflection. I added a new test for extensions of scalapb options in the same manner that scalapb-validate does: scalapb/ScalaPB@392a164 - this did the trick for the example you sent.

Can you try the snapshot release of protoc-gen-scala from the artifact section here: https://github.com/scalapb/ScalaPB/actions/runs/1410779035 and see if it fully solves your use case? For some strange reason, there will be another zip within the zip that you download from there.

Petar Karadzhov
@karadzhov

@thesamet thank you very much for your prompt response and solution. I've tried the snapshot artifact and can confirm that the compilation using protoc directly doesn't fail with the exception anymore.

I've tested the compilation using buf with different configurations and found out that when validate_at_construction and insert_validator_instance are both set to true there are warnings like this one "Warning: Duplicate generated file name "example/common/certificate/v1/CertificateSigningRequest.scala". Buf will continue without error here and drop the second occurrence of this file, but please raise an issue with the maintainer of the plugin." and the output is wrong.

Although this problem does not occur when using protoc directly, I still wonder whether you would be able to solve the problem with buf by modifying the generator somehow.

Nadav Samet
@thesamet

Hi @karadzhov , this is a bug in buf that does not have a workaround on the ScalaPB side. The ScalaPB validator plugin uses an insertion feature of the protoc plugin protocol, so the validator plugin can insert validation code into the code generated by the core ScalaPB plugin. buf doesn't seem to support this, and I can file a bug with that project, so we can track this.

How do you invoke scalapb-validate through buf? Can you help me reproduce this part?

Nadav Samet
@thesamet
@karadzhov - I took another look, and it seems that buf does support insertion points starting with version 0.22, and there's a fix (not sure if relevant) coming in 1.0.0-RC3; see buf's changelog. A few things to check: that scalapb runs before scalapb-validate, and that their output directories are the same.
Petar Karadzhov
@karadzhov
@thesamet gladly, here is the updated repo that can be used to reproduce the issue. I am using buf version 1.0.0-rc6 and the protoc-gen-* executables have to be on the path.
Nadav Samet
@thesamet
@karadzhov I was able to identify the buf issue that causes this and filed an issue with the project. We can track it there: bufbuild/buf#702
Petar Karadzhov
@karadzhov

@thesamet Thank you!

Please let me ask you about the final piece of my puzzle so far, which is gRPC.

I would really like to use fs2-grpc, mainly because of the cats integration, so that I don't need to manually wrap the futures everywhere. (By the way, is it intentional that the documentation links to a fork that is a bit behind the upstream?)

I assume it won't be as straightforward as scalapb-validate, because I wasn't able to find any protoc-gen plugin built from the codegen. What would you advise?
I assume it won't be as straightforward as scalapb-validate because I wasn't able to find any protoc-gen plugin built from the codegen. What would you advise?