Mike Limansky
@limansky
@thesamet Thanks, that might be an option.
Petar Karadzhov
@karadzhov

@thesamet I thought it would already be there, because that particular version was built after your PR got merged. I will try to reach out to the maintainers.

Regarding ScalaPB itself, it would be much simpler for development if there were a ScalaPB organization in the buf registry that published the proto files, which could later be used like this instead of copying them manually.

@ngbinh yes you can find a simple example over here

Binh Nguyen
@ngbinh
@thesamet @karadzhov oh, thanks. Will try it out
Nadav Samet
@thesamet
@karadzhov I sent them an additional PR, hope I get it right this time: typelevel/fs2-grpc#434
Nadav Samet
@thesamet
Looks like it would require some additional work on my end. I'll update the PR above later today or tomorrow.
I am not sure what the process is for getting the scalapb plugin published in the buf.build org. Ideally they'd have something that lets them pull the versions I already publish. Feel free to check with them.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Thanks, I reproduced it.
However, from my point of view, the PR should be done against the scalapb compiler. The NPE happens in Person.messageReads (which is generated by the scalapb compiler).
Or can it be handled anywhere else? In ProtoSQL?
Alexis BRENON
@brenon.alexis:matrix.org
[m]

Maybe I found a way to fix it.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (revision 28d88a90150c9242ee9e3a57ff3b86da6f233ae9)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala    (date 1636475671326)
@@ -69,8 +69,13 @@
       input: Expression
   ): Expression = {
     if (fd.isRepeated && !fd.isMapField) {
+      val nonNullInput = If(
+        IsNull(input),
+        input,
+        Literal.fromObject(Array.empty[String])
+      )
       val objs = MapObjects(
-        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, input),
+        (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, nonNullInput),
         input,
         protoSql.singularDataType(fd)
       )

This does not work "as-is". I am not very fluent with Catalyst ^^'
But this should be a good start.

Nadav Samet
@thesamet
That's in the right direction. Though we can't assume it's an array of strings there. Not sure if it matters.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hum, yeah, actually we need to look inside to get the singularDataType of fd.
Nadav Samet
@thesamet
Actually, this function converts Catalyst to Scala objects, so erasure works to our advantage. Probably Array.empty[Object] will work.
Nadav Samet
@thesamet
I think that the nonNullInput in the code snippet above is applied to elements within the array, not to the array itself.
The goal is either to produce an empty array, or to omit the key-value pair from the resulting map when the value is null.
Nadav Samet
@thesamet
@brenon.alexis:matrix.org I wonder if the only change needed is to just drop null values for _._2 in mkMap in JavaHelpers.scala
Petar Karadzhov
@karadzhov
@thesamet thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]

@thesamet: This seems to work.

diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (revision 07217938678b93309fb2a148f181e673fbfef21d)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala    (date 1636493180889)
@@ -82,7 +82,10 @@
   def mkMap(cmp: GeneratedMessageCompanion[_], args: ArrayData): Map[FieldDescriptor, PValue] = {
     cmp.scalaDescriptor.fields
       .zip(args.array)
-      .filterNot(_._2 == PEmpty)
+      .filter {
+        case (_, null) | (_, PEmpty) => false
+        case _ => true
+      }
       .toMap
       .asInstanceOf[Map[FieldDescriptor, PValue]]
   }

Too late to open the PR; I will do it tomorrow. Are you OK with the pattern matching, or do you prefer another code construction?
Thanks for the advice. Hope it solves my problem (I will test it on my production code tomorrow).

Nadav Samet
@thesamet
@brenon.alexis:matrix.org This diff looks fine. Looking forward to the PR.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Damn it. We're still using Spark 2.4.
I need to backport this PR to scalapb-sparksql v0.10.4...
But mkMap doesn't seem to exist in that version...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
This seems to work too. scalapb/sparksql-scalapb@5b46381
I don't know if you'd be interested in merging it too? In that case, you'd need to create a branch for the 0.10.5 release against which I can open a PR.
Alexis BRENON
@brenon.alexis:matrix.org
[m]
However, I still have an error with my production spark job... I'll try to build a minimal reproducible test.

It happens with a highly nested message, but on a root field which is a repeated nested message.

message Auction {
  Request request...
  repeated Response responses...
  ...
}

If responses is null it fails. But it seems to handle the other null-repeated fields properly.

But it seems to handle the addresses field of the Person message properly. So I don't know what can cause this error...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hum... My bad, I probably have some clash between default spark.implicits._ and the protoSql.implicits._
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Not sure if adding a new option to have the old behavior (which throws an NPE) is desirable. I think it would be best to add the fix without the flag on both branches.
(But let me know if I am missing any consideration why someone is likely to want the current behavior, going for simplicity)
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I chose to add an option to avoid breaking the current behavior. But actually, I don't know if anybody will need it.
And in fact, the protobuf guide says that default values are omitted in JSON-encoded data, that missing or null values are interpreted as the default value, and that the default value of repeated fields is empty.
So this PR is actually a fix that makes it more standards-compliant. I am going to update it to remove the option.
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I just updated the PR.
Petar Karadzhov
@karadzhov
@thesamet I gave it a try and unfortunately it fails with an exception about a missing main method. To reproduce it, please use the updated repository.
Ilya
@squadgazzz

Hey there!
Akka gRPC generates Scala classes from proto files like this:

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = throw new GrpcServiceException(Status.UNIMPLEMENTED)
}

Is it possible to configure ScalaPB somehow to generate the following code?

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = foo(in, new GrpcMetadataImpl(new io.grpc.Metadata()))
}
Nadav Samet
@thesamet
@karadzhov I should have tested the build more - I assumed it was working before and just needed publishing... I sent a new PR: typelevel/fs2-grpc#440
@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.
Passing different metadata is done by creating new stubs. This is a lightweight operation since the same RPC channel can be shared by any number of stubs.
DevilOps
@Davitron

Hi all, I'm trying to run sbt clean protocGenerate on a MacBook with an Apple Silicon M1. I get the following error:

[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe: not found: https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe

I have added PB.protocVersion := "3.17.3" to my build.sbt but it still fails. How do I fix this?

Nadav Samet
@thesamet
@Davitron One of the reasons I can think of is that you are in a multi-project build, and the setting is not applied in a specific sub-project.
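If that is the cause, one sketch of a fix (assuming a standard sbt multi-project build using the sbt-protoc plugin, which defines PB.protocVersion) is to scope the setting to the whole build rather than to a single project:

```scala
// build.sbt (sketch): ThisBuild applies the setting to every sub-project,
// so no module falls back to the default protoc version.
ThisBuild / PB.protocVersion := "3.17.3"

// Alternatively, set it explicitly inside each sub-project that compiles protos.
// The "client" project name here is just an illustration.
lazy val client = (project in file("client"))
  .settings(
    PB.protocVersion := "3.17.3"
  )
```

Either way, `show client/PB.protocVersion` in the sbt shell is a quick way to confirm which version a given sub-project actually resolved.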
Ilya
@squadgazzz

@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.

What kind of stubs do you mean?

@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.

And why do you mention different metadata? It's the same.

Nadav Samet
@thesamet
Stubs refer to the generated client interface. It looks like the intent of your question was to ask how to pass different metadata to different client calls. The way to do this is to create a client, and then:
client.withInterceptors(MetadataUtils.newAttachHeadersInterceptor(metadata))
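A minimal sketch of that pattern (MyServiceGrpc and the header name are hypothetical placeholders; Metadata and MetadataUtils come from grpc-java):

```scala
import io.grpc.{ManagedChannelBuilder, Metadata}
import io.grpc.stub.MetadataUtils

// Build the metadata to attach to outgoing calls.
// The header key "x-request-id" is just an example.
val headers = new Metadata()
val requestIdKey = Metadata.Key.of("x-request-id", Metadata.ASCII_STRING_MARSHALLER)
headers.put(requestIdKey, "abc-123")

// A single channel can be shared by any number of lightweight stubs.
val channel = ManagedChannelBuilder.forAddress("localhost", 9000).usePlaintext().build()

// MyServiceGrpc.stub is a placeholder for a ScalaPB-generated client stub.
val stub = MyServiceGrpc.stub(channel)
val stubWithHeaders =
  stub.withInterceptors(MetadataUtils.newAttachHeadersInterceptor(headers))
```

Each withInterceptors call returns a new stub over the same channel, so deriving one stub per metadata set is cheap.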
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Merged your PR for sparksql-scalapb. If you need this fixed for sparksql-scalapb 0.10.x, I created a new branch 0.10.x that can be used as a target for backporting the fix.
Petar Karadzhov
@karadzhov
@thesamet I gave it a try and there weren't any problems so far. I will try to make an actual client and server out of it soon, to verify that it's also fine at runtime, and will let you know. Thank you once more!
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hi. Thanks for the merges.
Can I expect a v0.10.5 release soon? Do I need to perform any action?
Nadav Samet
@thesamet
@brenon.alexis:matrix.org I've just released 0.10.5.
Dave Kichler
@dkichler

Hi there, just recently attempted to update one of my projects from sbt 1.4.9 to 1.5.5 and am now unable to resolve the binary protoc-gen-validate dependencies:

[error] (client / update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin: not found: https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin

It seems to be appending the wrong extension (.protoc-plugin vs .exe); the same dependency resolved fine via sbt 1.4.9:

sbt:project> show client/protobuf:managedClasspath
...
[info] * Attributed(/Users/dk/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.exe)

Curious if anyone can explain the different extension, or suggest a workaround/fix?

Nadav Samet
@thesamet
@dkichler can you file an issue with the exact steps to reproduce? A minimal example will be great.
Dave Kichler
@dkichler
Hi @thesamet - it turns out the issue only manifests in a cross-module dependency through .dependsOn(). I've put together this minimal example that reproduces the issue: https://github.com/dkichler/protoc-plugin-resolution-issue
Could very well be an sbt issue, but figured I'd start here in case anything pops out at you.
Dave Kichler
@dkichler
Happy to file an issue with more details if you think it belongs in one of the scalapb related repos