Nadav Samet
@thesamet
Alexis, enumValue.name returns the proto name of the enum value. If you want the Scala name, use enumValue.scalaValueDescriptor.scalaName.
The scalaName may or may not have its prefix stripped, depending on the enum_strip_prefix option; it may also carry other modifications that avoid naming conflicts with other Scala symbols, to make sure the generated code compiles.
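For illustration, a minimal sketch of the difference, assuming a hypothetical proto enum Color { COLOR_RED = 0; } generated with enum_strip_prefix enabled (the names here are made up, not from this conversation):

val v: Color = Color.RED          // the generated Scala object is named by scalaName
v.name                            // "COLOR_RED" -- always the proto name, per the API contract
v.scalaValueDescriptor.scalaName  // "RED" -- prefix stripped, possibly escaped to compile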
Alexis BRENON
@AlexisBRENON
Thanks for the answer. But what I want is for the enum_strip_prefix option to apply to both enumValue.scalaValueDescriptor.scalaName AND enumValue.name.
Nadav Samet
@thesamet
Applying it to enumValue.name would break the API contract (returning the proto name) and could break other libraries. Can you share more about the use case and why accessing it through the descriptor is not sufficient?
Petar Karadzhov
@karadzhov

Hello guys, I've been trying to make ScalaPB (Scala + validation) work together with buf in this example repo here.

I am running macOS, and the latest versions of the plugins (protoc-gen-scala and protoc-gen-scalapb-validate) are on my PATH.

Everything works fine until I want validation to happen during construction, by setting the extension options here, which results in a compilation failure like this one here.

Any ideas on how to fix this would be much appreciated.

Nadav Samet
@thesamet
Hi @karadzhov , thanks for reporting this and putting together an example repo. Can you verify whether the issue you found is related to buf or not? Would the same protos generate correctly through an sbt build?
Petar Karadzhov
@karadzhov

@thesamet I've added the validate/validate.proto manually to the repository, and using plain protoc without buf produces the same result.

It's also possible that I am not doing it right; however, without the validation options it generates correct output.

Here is the exact command I used and its output

Nadav Samet
@thesamet
@karadzhov Perfect. I'll take a look at it this evening. Which versions of scalapb, scalapb-validate and protoc are involved?
Petar Karadzhov
@karadzhov

@thesamet thank you very much!

Protoc: libprotoc 3.17.3
ScalaPB: protoc-gen-scala-0.11.1-osx-x86_64
ScalaPB validate: protoc-gen-scalapb-validate-0.3.2-unix

Nadav Samet
@thesamet

@karadzhov , I was able to reproduce the issue. You are using the native scalapb plugin, which is built with Scala Native. The Java code that parses the input protos relies on Java reflection, and with Scala Native all reflective calls need to be known at compile time. To accomplish this in ScalaPB, we rely on our test coverage to exercise all the code paths that could possibly trigger reflection. I added a new test for extensions of scalapb options in the same manner that scalapb-validate does: scalapb/ScalaPB@392a164 - this did the trick for the example you sent.

Can you try the snapshot release of protoc-gen-scala from the artifact section here: https://github.com/scalapb/ScalaPB/actions/runs/1410779035 and see if it fully solves your use case? For some strange reason, there will be another zip within the zip that you download from there.

Petar Karadzhov
@karadzhov

@thesamet thank you very much for your prompt response and solution. I've tried the snapshot artifact and can confirm that compilation using protoc directly doesn't fail with the exception anymore.

I've tested compilation using buf with different configurations and found out that when validate_at_construction and insert_validator_instance are both set to true, there are warnings like this one: "Warning: Duplicate generated file name "example/common/certificate/v1/CertificateSigningRequest.scala". Buf will continue without error here and drop the second occurrence of this file, but please raise an issue with the maintainer of the plugin." - and the output is wrong.

Although this problem does not occur when using protoc directly, I still wonder whether you would be able to solve it by modifying the generator somehow.

Nadav Samet
@thesamet

Hi @karadzhov , this is a bug in buf that does not have a workaround on the ScalaPB side. The ScalaPB validator plugin uses the insertion feature of the protoc plugin protocol, so the validator plugin can insert validation code into the code generated by the core ScalaPB plugin. buf doesn't seem to support this; I can file a bug with that project so we can track it.

How do you invoke scalapb-validate through buf? Can you help me reproduce this part?
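For context, a rough sketch of what the insertion feature looks like at the protocol level, using protobuf-java's plugin API (the file name and insertion-point name below are made up for illustration, not what ScalaPB actually emits):

import com.google.protobuf.compiler.PluginProtos.CodeGeneratorResponse

// The second plugin emits a File whose insertionPoint names a marker that
// the first plugin left behind in a file it generated. protoc (or buf)
// then splices the content in at that marker instead of writing a new file.
val insertion = CodeGeneratorResponse.File.newBuilder()
  .setName("example/Person.scala")        // file generated by the core plugin (hypothetical)
  .setInsertionPoint("Person:companion")  // marker name (hypothetical)
  .setContent("// validation code spliced in here\n")
  .build()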

Nadav Samet
@thesamet
@karadzhov - I took another look, and it seems that buf does support insertion points starting with version 0.22, and there's a fix (not sure if relevant) coming in 1.0.0-RC3; see buf's changelog. A few things to look into: that scalapb runs before scalapb-validate, and that their output directories are the same.
Petar Karadzhov
@karadzhov
@thesamet gladly, here is the updated repo that can be used to reproduce the issue. I am using buf version 1.0.0-rc6, and the protoc-gen-* executables have to be on the PATH.
Nadav Samet
@thesamet
@karadzhov I was able to identify the buf issue that causes this and filed an issue with the project. We can track it there: bufbuild/buf#702
Petar Karadzhov
@karadzhov

@thesamet Thank you!

Please let me ask you about the final piece of my puzzle so far, which is gRPC.

I would really like to use fs2-grpc, mainly because of the cats integration, so that I don't need to manually wrap the futures everywhere. (By the way, is it intentional that the documentation links to a fork which is a bit behind the upstream?)

I assume it won't be as straightforward as scalapb-validate, because I wasn't able to find any protoc-gen plugin built from the codegen. What would you advise?

Nadav Samet
@thesamet
@karadzhov I sent fs2-grpc a pull request to enable publishing a standalone plugin: typelevel/fs2-grpc#433 . It looks like it was intentionally disabled at some point in the past. I also updated the links to fs2-grpc in ScalaPB's docs; they pointed at the original fs2-grpc repo, but the project moved to typelevel earlier this year.
Petar Karadzhov
@karadzhov
@thesamet you are truly awesome! Thank you once more for enabling us to use a language which we love with tools that do not support it "natively".
Nadav Samet
@thesamet
You're welcome. It looks like the fs2-grpc PR just got merged. Hope the buf bug report gets addressed soon and you'll have a path forward.
Nadav Samet
@thesamet
@karadzhov ... and the buf issue got fixed too: bufbuild/buf#702
Petar Karadzhov
@karadzhov

@thesamet I am looking forward to the next buf release.

Meanwhile, I've tried to find the fs2-grpc compiler plugin executable, but without much success.

scalapb-validate's is published alongside the jars; however, fs2-grpc's is not there. In case I am looking at the wrong artifact, would you please point me to the correct one? Thank you :)

Mike Limansky
@limansky
Hi all, could anybody clarify whether protobuf 2.6.1 is still supported with the latest ScalaPB? I've run into a protobuf compilation issue (I suppose it runs protoc with incorrect arguments). Should I create an issue for this?
Mike Limansky
@limansky
Here is the error message: Unknown flag: --jvm_0_opt.
Binh Nguyen
@ngbinh
@karadzhov buf looks great! Does it support scalapb out of the box?
Nadav Samet
@thesamet
@karadzhov The artifact name will be protoc-gen-fs2-grpc; it will get published with the next release of fs2-grpc. That project is maintained outside the ScalaPB org, so please feel free to reach out to its maintainers to check when a release is expected, or whether there are snapshot releases you can use in the meantime.
Hi @limansky , protoc 2.6.1 is no longer supported.
Nadav Samet
@thesamet
@ngbinh ScalaPB ships standalone protoc plugins that buf or protoc can use directly. You'll need to download the ScalaPB native plugin for Linux or OS X (or the ScalaPB JVM-based plugin) and have it on your PATH, or otherwise configure buf to use it.
Mike Limansky
@limansky
@thesamet thanks for the reply. Maybe it should produce a more specific message then. Which version should I use if I'd like to use protobuf-java-2.6.1 (because I'm running inside a Spark environment)?
Nadav Samet
@thesamet
@limansky Yes, maybe we can put some check on the value of protocVersion, but possibly there are uses of it that don't cause this error (if ScalaPB isn't invoked, for example). For your specific use case, I suggest using a recent protoc and protobuf-java, and shading to avoid binary-compatibility issues with Spark: https://scalapb.github.io/docs/sparksql#setting-up-your-project.
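The setup in those docs boils down to an sbt-assembly shade rule along these lines (a sketch; the shadeproto target package name is illustrative):

// In build.sbt: rename the protobuf-java classes bundled in the fat jar
// so they don't clash with the older protobuf version that ships with Spark.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
)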
Mike Limansky
@limansky
@thesamet Thanks, that might be an option.
Petar Karadzhov
@karadzhov

@thesamet I thought it must be there already, because that particular version was built after your PR got merged. I will try to reach out to the maintainers.

Regarding ScalaPB itself, it would be much simpler for development if there were a ScalaPB organization in the buf registry that published the proto files, which could then be used like this instead of being copied manually.

@ngbinh yes you can find a simple example over here

Binh Nguyen
@ngbinh
@thesamet @karadzhov oh, thanks. Will try it out
Nadav Samet
@thesamet
@karadzhov I sent them an additional PR, hope I get it right this time: typelevel/fs2-grpc#434
Nadav Samet
@thesamet
Looks like it would require some additional work on my end. I'll update the PR above later today or tomorrow.
I'm not sure what the process is for getting the scalapb plugin published in the buf.build org. Ideally they'd have something that allows them to pull the versions I already publish. Feel free to check with them.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Thanks, I reproduced it.
However, from my point of view, the fix should be made in the scalapb compiler. The NPE happens in Person.messageReads (which is generated by the scalapb compiler).
Or can it be handled anywhere else? In the ProtoSQL?
Alexis BRENON
@brenon.alexis:matrix.org
[m]

Maybe I found a way to fix it.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (revision 28d88a90150c9242ee9e3a57ff3b86da6f233ae9)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (date 1636475671326)
@@ -69,8 +69,13 @@
       input: Expression
   ): Expression = {
     if (fd.isRepeated && !fd.isMapField) {
+      val nonNullInput = If(
+        IsNull(input),
+        input,
+        Literal.fromObject(Array.empty[String])
+      )
       val objs = MapObjects(
-        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, input),
+        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, nonNullInput),
         input,
         protoSql.singularDataType(fd)
       )

This does not work "as-is". I am not very fluent with Catalyst ^^'
But this should be a good start.

Nadav Samet
@thesamet
That's in the right direction, though we can't assume it's an array of strings there. Not sure if it matters.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hmm, yeah, actually we need to look inside to get the singularDataType of fd.
Nadav Samet
@thesamet
Actually, this function converts Catalyst to Scala objects, so erasure works to our advantage. Probably Array.empty[Object] will work.
Nadav Samet
@thesamet
I think that the nonNullInput in the code snippet above is applied to the elements within the array, not to the array itself.
The goal is either to produce an empty array, or to omit the key-value pair from the resulting map when the value is null.
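To make the intended behavior concrete, a hypothetical helper (repeatedOrEmpty is made up for illustration; PRepeated and PValue are scalapb.descriptors types):

import scalapb.descriptors.{PRepeated, PValue}

// Map a possibly-null decoded array to a PValue, yielding an empty
// repeated value instead of an NPE when the column is null.
def repeatedOrEmpty(decoded: Vector[PValue]): PValue =
  if (decoded == null) PRepeated(Vector.empty) else PRepeated(decoded)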
Nadav Samet
@thesamet
@brenon.alexis:matrix.org I wonder if the only change needed is to just drop null values for _._2 in mkMap in JavaHelpers.scala
Petar Karadzhov
@karadzhov
@thesamet thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]

@thesamet: This seems to work.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (revision 07217938678b93309fb2a148f181e673fbfef21d)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (date 1636493180889)
@@ -82,7 +82,10 @@
   def mkMap(cmp: GeneratedMessageCompanion[_], args: ArrayData): Map[FieldDescriptor, PValue] = {
     cmp.scalaDescriptor.fields
       .zip(args.array)
-      .filterNot(_._2 == PEmpty)
+      .filter {
+        case (_, null) | (_, PEmpty) => false
+        case _ => true
+      }
       .toMap
       .asInstanceOf[Map[FieldDescriptor, PValue]]
   }

Too late to open the PR; I will do it tomorrow. Are you OK with the pattern matching, or do you prefer another code construct?
Thanks for the advice. I hope it will solve my problem (I will test it on my production code tomorrow).

Nadav Samet
@thesamet
@brenon.alexis:matrix.org This diff looks fine. Looking forward to the PR.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Damn it. We're still using Spark 2.4.
I need to backport this PR to scalapb-sparksql v0.10.4...
But mkMap does not seem to exist in that version...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
This seems to work too: scalapb/sparksql-scalapb@5b46381
I don't know if you'd be interested in merging it as well? In that case, you would have to create a branch for the 0.10.5 release against which I can open a PR.