Nadav Samet
@thesamet
You're welcome. It looks like the fs2-grpc PR just got merged. Hope the buf bug report will get addressed soon and you'll have a path forward.
Nadav Samet
@thesamet
@karadzhov ... and the buf issue got fixed too: bufbuild/buf#702
Petar Karadzhov
@karadzhov

@thesamet I am looking forward to the next buf release.

Meanwhile I've tried to find the fs2-grpc compiler plugin executable but without much success.

Scalapb-validate's plugin is published alongside the jars; however, fs2-grpc's is not there. In case I am looking at the wrong artifact, would you please point me to the correct one? Thank you :)

Mike Limansky
@limansky
Hi all, could anybody clarify whether protobuf 2.6.1 is still supported with the latest ScalaPB? I've run into a protobuf compilation issue (I suppose it runs protoc with incorrect arguments). Should I create an issue for this?
Mike Limansky
@limansky
Here is the error message: Unknown flag: --jvm_0_opt.
Binh Nguyen
@ngbinh
@karadzhov buf looks great! Does it support scalapb out of the box?
Nadav Samet
@thesamet
@karadzhov The artifact name will be protoc-gen-fs2-grpc, it will get published with the next release of fs2-grpc. This project is maintained outside the ScalaPB org, please feel free to reach out to its maintainers to check when a release is expected or whether there are snapshot releases you can use in the meantime.
Hi @limansky, protoc 2.6.1 is no longer supported.
Nadav Samet
@thesamet
@ngbinh ScalaPB ships standalone protoc plugins that buf or protoc can directly use. You'll need to download the ScalaPB native plugin (for Linux or OS X) or the ScalaPB JVM-based plugin and have it on your path, or otherwise configure buf to use it.
Mike Limansky
@limansky
@thesamet thanks for the reply. Maybe it should produce a more specific message then. Which version should I use if I'd like to use protobuf-java 2.6.1 (because I'm running inside a Spark environment)?
Nadav Samet
@thesamet
@limansky Yes, maybe we can put some check on the value of protocVersion, but possibly there are uses of it that don't cause this error (if ScalaPB isn't invoked, for example). For your specific use case, I suggest using a recent protoc and protobuf-java and using shading to avoid binary compatibility issues with Spark: https://scalapb.github.io/docs/sparksql#setting-up-your-project.
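
A minimal sketch of the shading setup those docs describe, using sbt-assembly ("shadeproto" is an arbitrary example package name, and the exact key spelling depends on your sbt-assembly version):

// build.sbt — shade the protobuf-java classes so they cannot clash with the
// protobuf 2.x that ships inside Spark (assumes the sbt-assembly plugin):
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
)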
Mike Limansky
@limansky
@thesamet Thanks, that might be an option.
Petar Karadzhov
@karadzhov

@thesamet I thought it must be there already because that particular version was built after your PR got merged. I will try to reach out to the maintainers.

Regarding ScalaPB itself, it would be much simpler for development if there were a ScalaPB organization in the buf registry that published the proto files, which could then be used like this instead of copying them manually.

@ngbinh yes you can find a simple example over here

Binh Nguyen
@ngbinh
@thesamet @karadzhov oh, thanks. Will try it out
Nadav Samet
@thesamet
@karadzhov I sent them an additional PR, hope I get it right this time: typelevel/fs2-grpc#434
Nadav Samet
@thesamet
Looks like it would require some additional work on my end. I'll update the PR above later today or tomorrow.
I am not sure what the way is to get the ScalaPB plugin published in the buf.build org. Ideally they'd have something that allows them to pull the versions I already publish. Feel free to check with them.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Thanks, I reproduced it.
However, from my point of view, the PR should be done against the scalapb compiler. The NPE happens in Person.messageReads (which is generated by the scalapb compiler).
Or can it be handled anywhere else? In ProtoSQL?
Alexis BRENON
@brenon.alexis:matrix.org
[m]

Maybe I found a way to fix it.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (revision 28d88a90150c9242ee9e3a57ff3b86da6f233ae9)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (date 1636475671326)
@@ -69,8 +69,13 @@
       input: Expression
   ): Expression = {
     if (fd.isRepeated && !fd.isMapField) {
+      val nonNullInput = If(
+        IsNull(input),
+        input,
+        Literal.fromObject(Array.empty[String])
+      )
       val objs = MapObjects(
-        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, input),
+        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, nonNullInput),
         input,
         protoSql.singularDataType(fd)
       )

This does not work "as-is". I am not very fluent with Catalyst ^^'
But this should be a good start.

Nadav Samet
@thesamet
That's in the right direction. Though we can't assume it's an array of strings there. Not sure if it matters.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hum, yeah, actually we need to look inside to get the singularDataType of fd.
Nadav Samet
@thesamet
actually, this function converts Catalyst to Scala objects, so erasure works to our advantage. Probably Array.empty[Object] will work.
Nadav Samet
@thesamet
I think that the nonNullInput in the code snippet above is applied to elements within the array, not to the array itself.
The goal is either to produce an empty array, or to omit the key-value pair from the resulting map if the value is null.
Nadav Samet
@thesamet
@brenon.alexis:matrix.org I wonder if the only change needed is to just drop null values for _._2 in mkMap in JavaHelpers.scala
Petar Karadzhov
@karadzhov
@thesamet thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]

@thesamet: This seems to work.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (revision 07217938678b93309fb2a148f181e673fbfef21d)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (date 1636493180889)
@@ -82,7 +82,10 @@
   def mkMap(cmp: GeneratedMessageCompanion[_], args: ArrayData): Map[FieldDescriptor, PValue] = {
     cmp.scalaDescriptor.fields
       .zip(args.array)
-      .filterNot(_._2 == PEmpty)
+      .filter {
+        case (_, null) | (_, PEmpty) => false
+        case _ => true
+      }
       .toMap
       .asInstanceOf[Map[FieldDescriptor, PValue]]
   }

Too late to open the PR; I will do it tomorrow. Are you OK with the pattern matching, or do you prefer another code construction?
Thanks for the advice. Hope it will solve my problem (I will test it on my production code tomorrow).
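
For later readers, a self-contained sketch of the same filtering idea on plain collections; PValue and PEmpty below are stand-ins for the scalapb.descriptors types, not the real ones:

// Runnable sketch of the null/PEmpty filtering proposed for mkMap above.
object FilterSketch extends App {
  sealed trait PValue
  case object PEmpty extends PValue // stand-in for scalapb.descriptors.PEmpty
  final case class PString(value: String) extends PValue

  val fields = Seq("name", "age", "email")
  val values: Seq[PValue] = Seq(PString("x"), null, PEmpty)

  // Same shape as the proposed mkMap change: drop pairs whose value is
  // null (missing column) or PEmpty (unset field) before building the map.
  val kept = fields.zip(values).filter {
    case (_, null) | (_, PEmpty) => false
    case _                       => true
  }.toMap

  println(kept) // Map(name -> PString(x))
}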

Nadav Samet
@thesamet
@brenon.alexis:matrix.org This diff looks fine. Looking forward to the PR.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Damn it. We're still using Spark 2.4.
I need to backport this PR to scalapb-sparksql v0.10.4...
But mkMap does not seem to exist in this version...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
This seems to work too. scalapb/sparksql-scalapb@5b46381
I don't know if you are interested in merging it too? In that case, you'd have to create a branch for the 0.10.5 release against which I can open a PR.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
However, I still have an error with my production Spark job... I'll try to build a minimal reproducible test.

It happens with a highly nested message, but on a root field which is a repeated nested message.

message Auction {
  Request request...
  repeated Response responses...
  ...
}

If responses is null it fails. But it seems to handle the other null-repeated fields properly.

But it seems to handle the addresses field of the Person message properly. So I don't know what could cause this error...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hum... My bad, I probably have a clash between the default spark.implicits._ and the protoSql.implicits._
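
As a hedged sketch of an import layout that avoids that kind of encoder clash (assuming sparksql-scalapb 0.11.x, where the ScalaPB encoders live in scalapb.spark.Implicits; older versions expose them under ProtoSQL.implicits):

import org.apache.spark.sql.SparkSession
import scalapb.spark.Implicits._ // ScalaPB-provided Encoders for generated messages

object EncoderSetup {
  val spark = SparkSession.builder().master("local[*]").getOrCreate()
  // Avoid also importing spark.implicits._ in the same scope for message
  // types: both imports supply implicit Encoders, and picking the wrong one
  // tends to surface only at runtime (e.g. NPEs on null repeated fields).
}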
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Not sure if adding a new option to have the old behavior (which throws an NPE) is desirable. I think it would be best to add the fix without the flag on both branches.
(But let me know if I am missing any consideration why someone is likely to want the current behavior, going for simplicity)
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I chose to add an option to avoid breaking the current behavior. But actually, I don't know if anybody will need it.
And in fact, the protobuf guide says that default values will be omitted in JSON-encoded data, that missing or null values will be interpreted as the default value, and that the default value of repeated fields is empty.
So this PR is actually a fix that makes it more standards-compliant. I am going to update it to remove the option.
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I just updated the PR.
Petar Karadzhov
@karadzhov
@thesamet I gave it a try and unfortunately it fails with an exception about a missing main method. To reproduce it, please use the updated repository.
Ilya
@squadgazzz

Hey there!
Akka gRPC generates Scala classes from proto files like this:

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = throw new GrpcServiceException(Status.UNIMPLEMENTED)
}

Is it possible to configure ScalaPB somehow to generate the following code?

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = foo(in, new GrpcMetadataImpl(new io.grpc.Metadata()))
}
Nadav Samet
@thesamet
@karadzhov I should have tested the build more - I assumed it was working before and just needed publishing... I sent a new PR: typelevel/fs2-grpc#440
@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.
Passing different metadata is done by creating new stubs. This is a lightweight operation since the same RPC channel can be shared by any number of stubs.
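
For illustration, a sketch of that stub-per-metadata pattern with grpc-java's MetadataUtils; MyServiceGrpc and its stub are hypothetical names standing in for your generated code:

import io.grpc.{ManagedChannelBuilder, Metadata}
import io.grpc.stub.MetadataUtils

object StubWithHeaders {
  val channel = ManagedChannelBuilder.forAddress("localhost", 9000).usePlaintext().build()
  val base = MyServiceGrpc.stub(channel) // hypothetical generated stub

  val headers = new Metadata()
  headers.put(Metadata.Key.of("x-request-id", Metadata.ASCII_STRING_MARSHALLER), "abc-123")

  // Deriving a new stub is cheap: both stubs share the same channel.
  val withHeaders = MetadataUtils.attachHeaders(base, headers)
}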
DevilOps
@Davitron

Hi all, I'm trying to run sbt clean protocGenerate on an Apple Silicon (M1) MacBook. I get the following error:

[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe: not found: https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe

I have added PB.protocVersion := "3.17.3" to my build.sbt but it still fails. How do I fix this?

Nadav Samet
@thesamet
@Davitron One of the reasons I can think of is that you are in a multi-project build and the setting is not applied to the sub-project that needs it.
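
A sketch of what that can look like, assuming a standard sbt-protoc/ScalaPB setup (the `protos` project name is an example):

// build.sbt — in a multi-project build, set PB.protocVersion inside every
// sub-project that generates code from protos, not only at the root:
lazy val protos = (project in file("protos"))
  .settings(
    PB.protocVersion := "3.17.3",
    Compile / PB.targets := Seq(
      scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
    )
  )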