Alexis BRENON
@brenon.alexis:matrix.org
[m]
However, I still have an error with my production spark job... I'll try to build a minimal reproducible test.

It happens with a highly nested message, but on a root field which is a repeated nested message.

message Auction {
  Request request...
  repeated Response responses...
  ...
}

If responses is null it fails. But it seems to handle the other null-repeated fields properly.

But it seems to handle the addresses field of the Person message properly. So I don't know what can cause this error...
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hmm... My bad, I probably have some clash between the default spark.implicits._ and protoSql.implicits._
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Not sure if adding a new option to have the old behavior (which throws an NPE) is desirable. I think it would be best to add the fix without the flag on both branches.
(But let me know if I am missing any consideration for why someone is likely to want the current behavior; I'm going for simplicity.)
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I chose to add an option to avoid breaking the current behavior, but actually I don't know if anybody will need it.
And in fact, the protobuf guide says that default values are omitted in JSON-encoded data, that missing or null values are interpreted as the default value, and that the default value of a repeated field is empty.
So this PR is actually a fix that makes it more standards-compliant. I am going to update it to remove the option.
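
(For illustration, a minimal sketch of that default-value behavior, assuming a ScalaPB-generated Person message with a repeated addresses field, and the scalapb-json4s library on the classpath:)

import scalapb.json4s.JsonFormat

// The JSON omits `addresses` entirely; per the protobuf JSON mapping it comes back
// as its default value, i.e. an empty collection rather than null.
val person = JsonFormat.fromJsonString[Person]("{}")
assert(person.addresses.isEmpty)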
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Thank you!
Alexis BRENON
@brenon.alexis:matrix.org
[m]
I just updated the PR.
Petar Karadzhov
@karadzhov
@thesamet I gave it a try and unfortunately it fails with an exception about a missing main method. To reproduce it, please use the updated repository.
Ilya
@squadgazzz

Hey there!
Akka gRPC generates Scala classes from proto files like this:

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = throw new GrpcServiceException(Status.UNIMPLEMENTED)
}

Is it possible to configure ScalaPB somehow to generate the following code?

trait MyServicePowerApi extends MyService {
    def foo(in: MyRequest, metadata: Metadata)
    def foo(in: MyRequest) = foo(in, new GrpcMetadataImpl(new io.grpc.Metadata()))
}
Nadav Samet
@thesamet
@karadzhov I should have tested the build more - I assumed it was working before and just needed publishing... I sent a new PR: typelevel/fs2-grpc#440
@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.
Passing different metadata is done by creating new stubs. This is a lightweight operation since the same RPC channel can be shared by any number of stubs.
DevilOps
@Davitron

Hi all, I'm trying to run sbt clean protocGenerate on an Apple Silicon (M1) MacBook. I get the following error:

[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe: not found: https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe

I have added PB.protocVersion := "3.17.3" to my build.sbt but it still fails. How do I fix this?

Nadav Samet
@thesamet
@Davitron One of the reasons I can think of is that you are in a multi-project build, and the setting is not applied to the specific sub-project.
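
(As a minimal illustration, a build.sbt sketch with hypothetical project names showing where PB.protocVersion needs to be applied: either once at ThisBuild scope, or on the sub-project that actually runs protoc.)

// build.sbt - applies the protoc version to every sub-project:
ThisBuild / PB.protocVersion := "3.17.3"

lazy val protos = (project in file("protos"))
  .settings(
    // ...or set it only here, on the sub-project that generates code:
    // PB.protocVersion := "3.17.3",
    Compile / PB.targets := Seq(
      scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
    )
  )

lazy val app = (project in file("app")).dependsOn(protos)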
Ilya
@squadgazzz

@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.

What kind of stubs do you mean?

@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.

And why did you mention different metadata? It's the same.

Nadav Samet
@thesamet
Stubs refer to the generated client interface. It looks like the intent of your question was to ask how to pass different metadata to different client calls. The way to do this is to create a client, and then:
client.withInterceptors(MetadataUtils.newAttachHeadersInterceptor(metadata))
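
(For illustration, a short sketch of the stub-per-metadata approach; MyServiceGrpc, MyRequest, and the header key are hypothetical names standing in for your ScalaPB-generated code:)

import io.grpc.{ManagedChannelBuilder, Metadata}
import io.grpc.stub.MetadataUtils

// One channel can be shared by any number of stubs.
val channel = ManagedChannelBuilder.forAddress("localhost", 9000).usePlaintext().build()

val headers = new Metadata()
headers.put(Metadata.Key.of("x-request-id", Metadata.ASCII_STRING_MARSHALLER), "abc-123")

// Create a new stub per metadata set; creating stubs is lightweight.
val stub = MyServiceGrpc.stub(channel)
  .withInterceptors(MetadataUtils.newAttachHeadersInterceptor(headers))

// stub.foo(MyRequest(...)) would now carry the attached headers.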
Nadav Samet
@thesamet
@brenon.alexis:matrix.org Merged your PR for sparksql-scalapb. If you need this fixed for sparksql-scalapb 0.10.x, I created a new branch 0.10.x that can be used as a target for backporting the fix.
Petar Karadzhov
@karadzhov
@thesamet I gave it a try and there weren't any problems so far. I will try to make an actual client and server out of it soon, just to verify that it's also fine at runtime, and let you know. Thank you once more!
Alexis BRENON
@brenon.alexis:matrix.org
[m]
Hi. Thanks for the merges.
Can I expect a v0.10.5 release soon? Do I need to perform any action?
Nadav Samet
@thesamet
@brenon.alexis:matrix.org I've just released 0.10.5.
Dave Kichler
@dkichler

Hi there, I just recently attempted to update one of my projects from sbt 1.4.9 to 1.5.5 and am now unable to resolve the binary protoc-gen-validate dependencies:

[error] (client / update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin: not found: https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin

It seems to be appending the wrong extension (.protoc-plugin instead of .exe); the same dependency resolved fine via sbt 1.4.9:

sbt:project> show client/protobuf:managedClasspath
...
[info] * Attributed(/Users/dk/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.exe)

Curious if anyone can explain the different extension, or suggest a workaround/fix?

Nadav Samet
@thesamet
@dkichler can you file an issue with the exact steps to reproduce? A minimal example will be great.
Dave Kichler
@dkichler
Hi @thesamet - it turns out the issue only manifests in a cross-module dependency through .dependsOn(). I've put together this minimal example that reproduces the issue: https://github.com/dkichler/protoc-plugin-resolution-issue
Could very well be an sbt issue, but figured I'd start here in case anything pops out at you.
Dave Kichler
@dkichler
Happy to file an issue with more details if you think it belongs in one of the scalapb related repos
Nadav Samet
@thesamet
Thanks for providing the test repo, @dkichler. I'll take a look soon. In the meantime, can you bump the version of sbt to the latest (1.5.5) and sbt-protoc to 1.0.4? It's possible that the bug you are seeing has been resolved upstream.
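
(For reference, a sketch of where those version bumps go, assuming an otherwise standard sbt-protoc setup:)

// project/build.properties should contain: sbt.version=1.5.5

// project/plugins.sbt
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.4")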
Kyle Leeners
@kyle.leeners_gitlab
Hi there, looking for some guidance. We've got a proto file that is riddled with ScalaPB options. We'd like to pull that file into a Java service (through Maven) and generate a Java client. Currently the file does not contain any references to Java. I'm wondering if the file can be imported and the Java-specific bits tacked on? Things like java_outer_classname, java_package, etc. Are those even necessary?
Alternatively, I see the gRPC docs have native support for Java generation. Any idea if the ScalaPB stuff will impact that generation? We have messages like so:
message Location {
    option (scalapb.message).no_box = true;
    string city = 1 [(scalapb.field).no_box = true];
    ...
}
Nadav Samet
@thesamet
Hi @kyle.leeners_gitlab, when you generate Java code from the proto, the ScalaPB options will not affect it, and setting Java-specific options won't affect the Scala source generation.
Kyle Leeners
@kyle.leeners_gitlab
awesome, thanks! I'll start to go down that road
Kyle Leeners
@kyle.leeners_gitlab

Hello, another question. I've got a proto file that I'd like to generate as both Scala and Java. The file imports another proto file (sourced through sbt). I'm running into issues during the generation because the imported file doesn't have any of the Java conversion options.

Is there a way to add / decorate the Java conversion options? I don't have direct access to the file, only the ability to import it.

Here's the file I'm trying to pull in, for reference:

syntax = "proto3";

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  single_file: true
  lenses: true
  retain_source_code_info: true
  flat_package: true
  no_default_values_in_constructor: true,
  package_name: "foo"
};

message PackageUid {
  option (scalapb.message).extends = "bar";
  option (scalapb.message).companion_extends = "baz";

  string uid = 1;
}
Nadav Samet
@thesamet
@kyle.leeners_gitlab you can use package-scoped options to enable java_conversions for an entire package: https://scalapb.github.io/docs/customizations/#package-scoped-options - the option to set is java_conversions: true
Kyle Leeners
@kyle.leeners_gitlab
I don't have the ability to define a package name for PackageUid.proto. Is there some default that I can hook into? Or would it just be foo?
Nadav Samet
@thesamet
@kyle.leeners_gitlab The thing is that if you don't have package or java_package in that file, the generated Java code doesn't have a package statement. I can't recall the details, but this leads to a problem where the generated Java code can't be accessed from Scala, and there isn't much that can be done on the ScalaPB side for this.
Kyle Leeners
@kyle.leeners_gitlab
okay good to know. Maybe I can bug some people to add a package statement
Nadav Samet
@thesamet
sounds like a pretty reasonable thing to have.
Nadav Samet
@thesamet
@dkichler I wasn't able to reproduce the plugin resolution issue on either Linux or Mac. I think it's a local issue. However, there were a number of issues with the build itself. I sent a PR, https://github.com/dkichler/protoc-plugin-resolution-issue/pull/1/files, with various comments explaining the issues.
Matt Davis
@Matt-S6
Hi! Is this the right place to ask a sparksql-scalapb usage question? In particular, I'm looking for a way to understand some Scala implicit resolution on a GeneratedMessage.
Nadav Samet
@thesamet
Hi @Matt-S6 , yes, this is the best place for these questions! :)
Matt Davis
@Matt-S6
Yay! ok, @thesamet, I got the sparksql-scalapb-test project to reproduce the issue. Here's the draft PR: thesamet/sparksql-scalapb-test#5
Nadav Samet
@thesamet
@Matt-S6 Yes, you're really close. The error message tells you that you need to provide an implicit Encoder for your type. In L118, try adding a second implicit parameter, encoder: org.apache.spark.sql.Encoder[A].
Another way to write it, since you don't need direct access to the implicit values and they are just being passed through:
def fromRawEventDS[A <: GeneratedMessage : GeneratedMessageCompanion : Encoder](
    ds: Dataset[RawEvent]
  ) = ...
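
(For illustration, a sketch of how the context-bound version might be used; RawEvent is the type from the snippet above, while MyEvent and the payload field are hypothetical:)

import org.apache.spark.sql.{Dataset, Encoder}
import scalapb.{GeneratedMessage, GeneratedMessageCompanion}
import scalapb.spark.Implicits._  // brings ScalaPB's Spark Encoders into scope

// Parse each raw payload into the target message type A.
def fromRawEventDS[A <: GeneratedMessage: GeneratedMessageCompanion: Encoder](
    ds: Dataset[RawEvent]): Dataset[A] =
  ds.map(raw => implicitly[GeneratedMessageCompanion[A]].parseFrom(raw.payload))

// val events: Dataset[MyEvent] = fromRawEventDS[MyEvent](rawEventsDs)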
João Ferreira
@jtjeferreira
Hi. I am a happy user of ScalaPB. Thanks for this amazing project. However, I recently needed to use Java gRPC in another project and was really frustrated that I could not find out how to define custom types like in ScalaPB, as described here: https://scalapb.github.io/docs/customizations#custom-types. By any chance, do you know if this feature is available in Java? Or does only ScalaPB implement this and it is not common in other languages? Or are my Google skills failing me?
Nadav Samet
@thesamet
Hey @jtjeferreira, sorry I missed your question from last week - thanks for the feedback! Custom types are a unique feature of ScalaPB. The standard implementations do not offer this flexibility.
João Ferreira
@jtjeferreira
Thanks @thesamet