Nadav Samet
@thesamet
@chenharryhua Got it. Maybe some extra functions can be added somewhere to accept ScalaPB messages. Look into the source of the current function: if all it needs is access to the message's Descriptor, ScalaPB can provide that without conversion to Java.
Harry Chen
@chenharryhua
@thesamet thanks for your help. By the way, I am also trying DynamicMessage, which may avoid the double Java/Scala code generation problem. Thanks for the fantastic lib/plugin. I'd like to share my code but GitHub is not stable today.
Besides, the implicit def messageCompanion in the companion object is very thoughtful. :)
Antonio
@AntonioYuen
Hi all, in my attempt to upgrade to Spark 3.0.0, I noticed that ScalaPB fails when invoking parseFrom, with this error: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.encoders.ExpressionEncoder. Could this be because there is a dependency on spark 2.4.x here https://github.com/scalapb/ScalaPB/blob/ca8633268f7011a883c83c0c062ea21df7fba314/build.sbt#L304?
I get this error when I send in my resulting encoders directly java.lang.UnsupportedOperationException: No Encoder found for com.google.protobuf.ByteString
Seth Tomy
@SethTomy_twitter

In Maven, when using runtime 0.9.0-M5, I'm getting the following error on compilation.

error: object creation impossible, since method bindService in class ServiceCompanion of type ... , executionContext: scala.concurrent.ExecutionContext)io.grpc.ServerServiceDefinition is not defined

When I switch to 0.10.7 I get

error: class Any needs to be a trait to be mixed in ...

I saw a post above where the runtime version needs to match something else but idk what that something else is.

Nadav Samet
@thesamet
@AntonioYuen Spark 3 support is being tracked in scalapb/sparksql-scalapb#97. It's currently blocked on Frameless support being complete.
Antonio
@AntonioYuen
@thesamet great! thanks for the info
Nadav Samet
@thesamet
@SethTomy_twitter The versions of compilerplugin and scalapb-runtime-grpc need to have the same major and minor versions as the generated code. When you switched to 0.10.x, did you regenerate the code? (Running sbt clean would trigger regeneration.)
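
One common way to keep these versions aligned (a sketch of typical sbt wiring, assuming the standard sbt-protoc setup) is to derive the runtime dependencies from the version constant exported by the compilerplugin, so the runtime can never drift from the code generator:

```scala
// build.sbt: scalapb.compiler.Version comes from the compilerplugin added
// in project/plugins.sbt, so runtime and generated code stay in lock-step.
libraryDependencies ++= Seq(
  "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf",
  "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion
)
```

Alec's build.sbt further down in this log uses the same pattern.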
Leif Battermann
@battermann

Hi, I have the following gRPC definition (simplified):

service DataService {
    rpc ListOrders (Empty) returns (stream Order) {}
    rpc ListTrades (Empty) returns (stream Trade) {}
    rpc ListFilteredOrders (ListOrdersRequest) returns (stream Order) {}
    rpc ListFilteredTrades (ListTradesRequest) returns (stream Trade) {}
    rpc GetStatus (Empty) returns (stream ServiceStatusResponse) {}
}

I can successfully generate all the Scala code. However, the generated code looks nothing like the example from the documentation (https://scalapb.github.io/grpc.html).

I want to use the client, but there is no stub method on the generated DataServiceProto object. What am I doing wrong?

[error] Main.scala:13:31: value stub is not a member of object data_service.DataServiceProto
[error]   val stub = DataServiceProto.stub(channel)
[error]                               ^
[error] one error found
Leif Battermann
@battermann

Solved it. I needed to add grpc = true, like this:

PB.targets in Compile := Seq(
  scalapb.gen(grpc = true) -> (sourceManaged in Compile).value / "scalapb"
)

I thought the default was true.

Nadav Samet
@thesamet
@battermann The default should be true, so what you're describing is unexpected. If you can file a bug with exact instructions (ideally a minimal project) that demonstrates the problem, that would be great.
Leif Battermann
@battermann
I will do that
Alec Zorab
@aleczorab_twitter

I'm trying to resurrect the grpcmonix plugin and update it to 0.10.1, and I'm seeing something really odd that I can't quite get my head around

in build.sbt

lazy val proto = (project in file("proto"))
  .settings(
    scalaVersion := "2.13.3",
    libraryDependencies ++= Seq(
      "io.grpc" % "grpc-netty" % scalapb.compiler.Version.grpcJavaVersion,
      "io.grpc" % "grpc-services" % scalapb.compiler.Version.grpcJavaVersion,
      "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion,
      "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf",
    ),
    PB.targets in Compile := Seq(
      PB.gens.java -> (sourceManaged in Compile).value,
      scalapb.gen(javaConversions = true) -> (sourceManaged in Compile).value,
      grpcmonix.generators.GrpcMonixGenerator() -> (sourceManaged in Compile).value,
    )
  )

in plugins.sbt

addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.28")

libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.1"
// libraryDependencies += "azorab" %% "grpcmonixgenerator" % "0.0.8-SNAPSHOT"

I get error: not found: value grpcmonix, as I'd expect.

If I uncomment the bottom line of plugins.sbt, I get error: object gen is not a member of package scalapb and error: not found: value javaConversions, which confuses me a lot. Has anyone got any suggestions before I go and hassle the sbt guys with the same question?

Alec Zorab
@aleczorab_twitter
It seems that marking "com.thesamet.scalapb" %% "compilerplugin" as "provided" in the grpcmonix build makes it work, so there's obviously something odd going on with the fact that it's coming in once directly and again as a transitive dependency
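
For reference, the fix Alec describes would look roughly like this in the generator project's own build (a sketch; the exact module layout of the grpcmonix fork is assumed):

```scala
// In the grpcmonix generator's build.sbt: mark compilerplugin as "provided"
// so the copy that sbt-protoc's plugins.sbt brings in directly is the one
// used at code-generation time, avoiding two conflicting copies.
libraryDependencies +=
  "com.thesamet.scalapb" %% "compilerplugin" % "0.10.1" % "provided"
```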
Seth Tomy
@SethTomy_twitter
@thesamet thank you! My minor versions weren't matching up, sorry about the easy one.
Leif Battermann
@battermann
@thesamet done
Leif Battermann
@battermann
how can I set custom headers for requests?
Leif Battermann
@battermann
I think I got it.
  import io.grpc.Metadata
  import io.grpc.stub.MetadataUtils

  val header = {
    val h = new Metadata()
    h.put(
      Metadata.Key.of("Authorization", Metadata.ASCII_STRING_MARSHALLER),
      "Basic Zm9vYmFyOmZvb2Jhci1wYXNz"
    )
    h
  }

  val stubWithHeader = MetadataUtils.attachHeaders(DataServiceGrpc.stub(channel), header)
sharonsyra
@Sharonsyra

Hey all,
"message":"Unable to parse the event wrapper: com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field. This could mean either that the input has been truncated or that an embedded message misreported its own length.

I've been getting this error. The message has two fields of type com.google.protobuf.any.Any. I would like to parse the Array[Byte] but it keeps failing due to this. There is no place where a length is being set.

Please assist. Thanks in advance

Nadav Samet
@thesamet
Hi @Sharonsyra, the error says that the array of bytes is not in the format expected for this message. Most likely the array of bytes was not produced by serializing a message of the same type (by calling its toByteArray method).
Louis
@LouisJB
Hi, if on a different (Windows) server I get a failure with a non-zero exit code from protoc, how can I debug that or figure out the cause?
Jakub Liska
@l15k4
Hey guys, I very often need to generate client code for some Kubernetes backend, which usually has tons and tons of imports ... It is very hard to collect all the proto files into a jar. Is there any tool for that? Something that would recursively download all the "dependencies" of a proto file and put them into a single directory from which the JAR could be created
Jakub Liska
@l15k4
ok, I gave it more thought and I think it is impossible ... the dep tree is often enormous in the golang world
Louis
@LouisJB

hi, if on a different (windows) server I get a failure with a non-zero exit code from protoc how to debug that or figure out the cause?

I think it might be different runtime DLLs between the machines - is there a way to enable more verbose protoc output (debug/trace) so the output might be shown in the sbt output, rather than it just showing me a bizarre exit code (which is a nonstandard error code; I think it is a Windows handle of the last code executed before it died)?

Nadav Samet
@thesamet
@l15k4 Maybe you can write a tool that does this by traversing the go directory tree? Or follow how another build system (like go's) assembles the proto files into a directory, and rely on that to generate from sbt.
Nadav Samet
@thesamet
@LouisJB To further isolate the problem, do you get this non-zero exit code from protoc also when the ScalaPB plugin is not included? For example, try to only generate Java.
IIRC, you are on an old version, so you might be hitting a bug that has already been fixed. You could try using a new version just for the sake of compiling on that machine, and if you identify the version where the fix was introduced, we could try to backport it.
skarma
@skarma_gitlab
Hi!
I'm trying to write a program that deserializes protobuf3 into Scala. However, I'm not sure how to write it...
I would like to parse the file below into Scala.
syntax = "proto3";

message Person {
  string name = 1;
  int32 age = 2;
}
Nadav Samet
@thesamet
Hi @skarma_gitlab, have you followed the basic installation described here: https://scalapb.github.io/sbt-settings.html#basic-installation ?
Then, create the file you pasted under src/main/protobuf/person.proto.
After you compile your project, you'll be able to call a method person.Person.parseFrom(bytes), which takes an array of bytes and gives you back a Person case class.
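
The end-to-end flow described above, as a sketch (the exact generated package depends on the proto file; person here follows the default file-based naming, since the pasted .proto declares no package):

```scala
// After compiling, the generated case class lives under the "person"
// package (derived from the file name person.proto).
val p = person.Person(name = "Alice", age = 30)
val bytes: Array[Byte] = p.toByteArray      // serialize to protobuf wire format
val back = person.Person.parseFrom(bytes)   // parse back into a case class
assert(back == p)                           // round-trips by value equality
```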
skarma
@skarma_gitlab
@thesamet
Thank you! I managed to resolve the problem!
Alexandre Montecucco
@alexmontecucco

Hello,
we have the following project structure (using sbt multi-projects builds):

  • build.sbt
  • projectA
  • projectB
  • git_proto_submodule (containing all protobuf files)
    • pkg1
      • obj1.proto
    • pkg2
      • obj2.proto

Let’s suppose that:

  • projectA needs pkg1/obj1.proto
  • pkg1/obj1.proto imports pkg2/obj2.proto

I would like for projectA to only specify the dependency on pkg1/obj1.proto.
Using sbt-protoc, I would do something like in config:

  • PB.includePaths in Compile ++= Seq(file("git_proto_submodule"))
  • PB.protoSources in Compile := Seq(file("git_proto_submodule/pkg1/obj1.proto"))

However, if I do this, only the obj1 scala code is generated. The code for the imported protobuf files is not generated, so compilation will fail.

Is there a simple way to achieve this?

Nadav Samet
@thesamet
@alexmontecucco protoSources controls which files to generate, and since obj1 needs obj2 you need to generate code for both. You could set PB.protoSources to "git_proto_submodule", so both files are generated without explicitly listing any of them.
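In sbt terms, the suggestion is roughly this (a sketch, reusing the paths from the question):

```scala
// Generate code for every .proto under the submodule; imports between
// pkg1 and pkg2 resolve because the include path covers the same root.
PB.protoSources in Compile := Seq(file("git_proto_submodule"))
PB.includePaths in Compile ++= Seq(file("git_proto_submodule"))
```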
Nadav Samet
@thesamet
Another pattern is to create another subproject with the git_proto_submodule, and in projectA use dependsOn(gitProtoSubmodule).
Alexandre Montecucco
@alexmontecucco
@thesamet at the moment we actually use projectA dependsOn(gitProtoSubmodule). The issues are that the generated code is fairly big and that it depends on all the protos (which makes the final deployment bigger).
In golang, for instance, that would not be an issue, because the granularity for building the binary is the package, whereas in Java it is the jar (we would then need ProGuard to remove unnecessary proto classes, which comes with its own set of issues)
Nadav Samet
@thesamet
@alexmontecucco Got it. So it sounds like the requirement is to generate Scala code only for the minimal subset of proto files out of git_proto_submodule that would make projectA compile?
Currently, the only way to accomplish that would be to manually list the files you need in git_proto_submodule.
Perhaps you can find a way to automatically build that list by combining a few command-line tools. For example, protoc has different flags that might help, like --dependency_out or --descriptor_set_out
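For example, a sketch of how --dependency_out could be used to list the transitive imports of one file (paths are illustrative, protoc is assumed to be on the PATH, and protoc requires some output directive alongside the dependency file):

```shell
# Writes a Make-style dependency file listing obj1.proto and everything
# it transitively imports; parse deps.txt to build the minimal file list.
protoc -I git_proto_submodule \
  --dependency_out=deps.txt \
  --descriptor_set_out=/dev/null \
  git_proto_submodule/pkg1/obj1.proto
```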
Alexandre Montecucco
@alexmontecucco
@thesamet Thank you for your answer.
The current ScalaPB behaviour makes sense, I just wanted to make sure I had not missed anything.
It looks like --dependency_out is a simple way to build the whole recursive import list. Thank you for the tip :)
Louis
@LouisJB

@LouisJB To further isolate the problem, do you get this non-zero exit code from protoc also when ScalaPB plugin is not included, for example - try to only generate java.

Hi, moving to the version with the embedded protoc, rather than the old one where it was external, does seem to have resolved the problem on that specific machine, interestingly. Which is good, as that is an upgrade I will merge into the main code soon. So it must be some executable or dependent-lib issue between platforms, I suppose.

Boyd Stephen Smith Jr.
@bss03_gitlab
Is there an example / preferred "pluginArtifact" for POMs that need to work on both Linux and MS Windows?
For the Java / Google stuff you can use ${os.detected.classifier} from the build extension kr.motd.maven:os-maven-plugin, and the build works on both Docker and other developers laptops.
I'm not sure how to choose between bat:windows and sh:unix artifacts based on where the build is running. (Am needing to do gRPC in addition to protobuf.)
Nadav Samet
@thesamet
@bss03_gitlab I wasn't able to create a single executable that would work for both Windows and Linux. Perhaps with GraalVM I can build a native Windows exe file; then ${os.detected.classifier} can be used, and the extension would be the same.
Nadav Samet
@thesamet

@bss03_gitlab I can think of a few solutions:

  1. Build a native plugin for windows, like we do with graalvm for os x and linux: https://github.com/scalapb/ScalaPB/blob/49a3b59ae811b7e3bd7f286e44dc77efcd575f21/.github/workflows/release.yml#L65-L100
  2. Help resolve xolstice/protobuf-maven-plugin#56 - perhaps in a fork.
  3. Add a new grpc package-scoped option to ScalaPB, so it can be enabled by adding another proto file instead of an additional plugin parameter: https://scalapb.github.io/docs/customizations#package-scoped-options - you'll also have to add scalapb.proto to your project.
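
For context, a package-scoped options file looks roughly like this (a sketch: flat_package is shown as an existing option for illustration; the grpc option proposed above did not exist yet at the time of this exchange):

```protobuf
// hypothetical file: src/main/protobuf/mypkg/options.proto
syntax = "proto3";

package mypkg;

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  scope: PACKAGE
  flat_package: true  // applies to every file in package mypkg
};
```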

I think out of all the options, (3) would be the easiest,

though it will require a new ScalaPB release that supports this functionality.