Nadav Samet
@thesamet
@MaggieLeber, I don't use mill. It would be useful to set up a minimal example project to demonstrate this problem.
and getName
Timothy Klim
@TimothyKlim

Hello, @thesamet!
Are there any issues with scalapb/scalapb-grpcweb#79? I've tried to match errors by status, as I would in grpc-java (https://grpc.github.io/grpc-java/javadoc/io/grpc/StatusRuntimeException.html), but in the scala.js version the status field is private.

Btw, thanks for the awesome library! It makes it easy to have an autowire-like library that handles the 2.13, 3.0, and scala.js versions.

Nadav Samet
@thesamet
Hey @TimothyKlim , I reviewed it earlier but forgot to publish my comment.
grpcweb is meant to keep the same signatures io.grpc has, so the feedback is to mimic the original method names, keep status private, and introduce getStatus()
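For illustration, the requested shape might look like this sketch, modeled on grpc-java's StatusRuntimeException (the class body and the use of io.grpc.Status here are placeholders, not scalapb-grpcweb's actual code):

import io.grpc.Status

// Keep the field private and expose it through the grpc-java accessor name.
class StatusRuntimeException(private val status: Status) extends RuntimeException {
  def getStatus(): Status = status
}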
Timothy Klim
@TimothyKlim

grpcweb is meant to keep the same signatures io.grpc has, so the feedback is to mimic the original method names, keep status private, and introduce getStatus()

I've updated the PR

Nadav Samet
@thesamet
merged!
Timothy Klim
@TimothyKlim
Thanks!
Maggie Leber
@MaggieLeber
@thesamet I figured that might be helpful, and the full set of protos is proprietary anyway.
As time permits, I'll create a cut-down exemplar with mill and get it to you. The next few days will be busy, as I was in a minor auto accident yesterday (nobody hurt) and we currently have a blizzard here.
Nadav Samet
@thesamet
@MaggieLeber no worries. Take care!
Thib
@Horneth
I was looking at https://github.com/scalapb/zio-grpc/blob/master/docs/decorating.md, which mentions that transformers can be used to pre/post-process requests/responses. The example shows post-processing of responses, but I fail to see how it can be used to pre-process requests?
Nadav Samet
@thesamet
Hi @Horneth , you can use transformContextM to preprocess a request, see https://scalapb.github.io/zio-grpc/docs/context#context-transformations
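For illustration, a minimal sketch of that pattern, loosely following the linked docs (the User type, the UserKey metadata key, and MyServiceImpl are hypothetical):

import io.grpc.{Metadata, Status}
import scalapb.zio_grpc.RequestContext
import zio.IO

case class User(name: String)

// Hypothetical metadata key carrying the caller's identity.
val UserKey: Metadata.Key[String] =
  Metadata.Key.of("user-key", Metadata.ASCII_STRING_MARSHALLER)

// Derive a richer context from the RequestContext before any handler runs.
def findUser(rc: RequestContext): IO[Status, User] =
  rc.metadata.get(UserKey).flatMap {
    case Some(name) => IO.succeed(User(name))
    case None       => IO.fail(Status.UNAUTHENTICATED)
  }

// The transformed service then requires Has[User] instead of Has[RequestContext]:
// val authenticated = MyServiceImpl.transformContextM(findUser)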
Thib
@Horneth
Can I also access the request message itself or only the RequestContext ?
Nadav Samet
@thesamet
No, it's not necessarily received at the time RequestContext is instantiated.
This is the case for client streaming: the request arrives, but the client stream may only deliver its first message later.
Bartosz Bąbol
@BBartosz
Hey, I'm using scalapb-json4s 0.9.3 with scalapb 0.9.7. It works without any problems on Scala 2.12.8, but when I update Scala to 2.12.10 I get an exception on one of the classes:
An exception or error caused a run to abort: com.google.protobuf.wrappers.DoubleValue$.messageCompanion()Lscalapb/GeneratedMessageCompanion;
java.lang.NoSuchMethodError: com.google.protobuf.wrappers.DoubleValue$.messageCompanion()Lscalapb/GeneratedMessageCompanion;
    at scalapb.json4s.JsonFormat$.<init>(JsonFormat.scala:428)
    at scalapb.json4s.JsonFormat$.<clinit>(JsonFormat.scala)
Maciej Gorywoda
@makingthematrix
Hi all,
Has anyone used scalapb together with javalite? I want to have my proto files translated to Java, not Scala, but the resulting jar is quite big: 0.5MB for only 10kb of proto files. I'm trying to turn on the "lite" option, but so far it doesn't work.
When I set PB.protocOptions := Seq("--java_out=lite") and run sbt compile it warns me that protocOptions is not used.
Brice Jaglin
@bjaglin
@makingthematrix protocOptions must be set at the config level: Compile / PB.protocOptions (see https://github.com/thesamet/sbt-protoc#additional-options)
Brice Jaglin
@bjaglin

However, since lite is not a protoc option but an option to the Java generator, you should customize the target instead:

Compile / PB.targets += Target(PB.gens.java, (Compile / sourceManaged).value, Seq("lite")),

This way, sbt-protoc should honor the expected syntax https://github.com/protocolbuffers/protobuf/blob/master/java/lite.md

protoc --java_out=lite:${OUTPUT_DIR}

Direct usage of the Target constructor is not documented in sbt-protoc, as there are many implicits to build it out of Generator and File, but with 3 arguments I personally find it clearer this way. See https://github.com/scalapb/protoc-bridge/blob/0214bdd17e9e3034d421bc51f38a4b7f34934478/bridge/src/main/scala/protocbridge/Target.scala.
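Putting that together, a self-contained build.sbt fragment might look like this sketch (fully qualifying the Target constructor so no implicits are involved):

// build.sbt
Compile / PB.targets += protocbridge.Target(
  PB.gens.java,                      // the stock protoc Java generator
  (Compile / sourceManaged).value,   // output directory for generated sources
  Seq("lite")                        // generator options, rendered as --java_out=lite:<dir>
)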

Maciej Gorywoda
@makingthematrix
huh
let's try :)
okay, 270kb instead of 0.5MB :)
thank you!
Brice Jaglin
@bjaglin
great, I have opened thesamet/sbt-protoc#230 to document that
Nadav Samet
@thesamet
Hi @BBartosz, it is unlikely that only updating the Scala patch version would cause binary incompatibility. Check libraryDependencies to see whether you are actually getting the versions you requested, or whether there is eviction going on due to another dependency. To further debug, it would be easiest if you can share a minimal project that reproduces this problem and file it as an issue with scalapb-json4s.
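For example, two sbt shell commands that can surface such an eviction (output format varies by sbt version):

sbt> show libraryDependencies   // the versions the build declares
sbt> evicted                    // dependencies replaced by newer versions on the classpath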
Maciej Gorywoda
@makingthematrix

great, I have opened thesamet/sbt-protoc#230 to document that

So my question resulted in an improvement in the docs. That's awesome. Thank you.

Antonio
@AntonioYuen
Anyone know how to fix this? type InternalOneOfEnum is not a member of com.google.protobuf.AbstractMessage. I need to generate my Java classes alongside my Scala classes with javaConversion on.
Vinh Le
@VQLE
Is anyone familiar with protobuf messages serialized by "io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer"? I got the error "Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero)." when trying to deserialize with scalapb. It looks like the message contains additional information besides the protobuf payload.
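The extra information described here matches Confluent's documented wire format: a magic byte (0), a 4-byte schema id, and a varint-encoded list of message indexes, all prepended to the protobuf payload, which is why parsing from byte zero yields "invalid tag (zero)". A hedged sketch of skipping that framing before handing the rest to a ScalaPB parser (MyMessage is hypothetical; in practice Confluent's own deserializer is the robust choice):

import com.google.protobuf.CodedInputStream

// Skip Confluent framing: magic byte, schema id, message indexes.
// The index list is assumed to use zigzag varints, as written by Kafka's ByteUtils.
def skipConfluentFraming(bytes: Array[Byte]): CodedInputStream = {
  val in = CodedInputStream.newInstance(bytes)
  require(in.readRawByte() == 0, "missing Confluent magic byte")
  in.readRawBytes(4)                             // 4-byte schema registry id
  val indexCount = in.readSInt32()               // 0 is shorthand for the [0] index list
  (0 until indexCount).foreach(_ => in.readSInt32())
  in
}

// val msg = MyMessage.parseFrom(skipConfluentFraming(bytes))  // MyMessage: hypothetical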
Vinh Le
@VQLE
@chenharryhua, I ran into a similar issue with "Kafka whose (de)serializer method has the signature def serialize(topic:String, data:A): bytes[]"... Were you able to solve it?
Hariprasath Thiagarajan
@Hprasath

Hi, I am facing a specific issue using scalapb to deserialize protobuf. Here is a small snippet from the proto file in question:

message MapEntry {
  oneof value {
    string string_value = 5;
    double number_value = 6;
    StrArray string_array_value = 14;
    DblArray number_array_value = 15;
  }
}

When I create the assembly jar using a Maven build and run it on a Databricks environment with Spark 3 and Scala 2.12, I get the following error message:
java.lang.UnsupportedOperationException: Unable to find constructor for <package namespace>.MapEntry.Value. This could happen if <package namespace>.MapEntry.Value is an interface, or a trait without companion object constructor.
Some library information used in the project: the Scala version is 2.12.10, the scalapb-runtime version is 0.9.0-M5, and the sparksql-scalapb version is 0.11.0-RC1.
Has anyone faced a similar issue?

Nadav Samet
@thesamet
@Hprasath The versions you stated are incompatible. Here are the version combinations that are supported: https://scalapb.github.io/docs/sparksql/#setting-up-your-project
DSwift510
@DSwift510

Hi, I'm getting an exception stacktrace when trying to perform val protoDF: DataFrame = ProtoSQL.protoToDataFrame(spark, rdd), even though I am importing the correct implicits as my last import:

import spark.implicits.StringToColumn
import scalapb.spark.Implicits._
import scalapb.spark.ProtoSQL

I'm running Spark 3.0.0 and Scala 2.12.10, and I believe the shading is done correctly; here is the build.sbt to verify.

Nadav Samet
@thesamet

@DSwift510 @Hprasath , I'll need exact steps to reproduce to be able to further help with the issue you are seeing. Please create a minimal example of this problem and submit an issue on github at https://github.com/scalapb/sparksql-scalapb/issues.

To save some time building a reproducible example, you can fork https://github.com/thesamet/sparksql-scalapb-test. Follow the instructions in README.md to modify and test it locally. Please keep only the changes necessary to demonstrate the problem. Once you are able to reproduce the problem, include a link to your fork in the ticket.

Alexander Khotyanov
@aksharp
@thesamet please take a look at this compile issue when generating proto files when you get a chance. scalapb/ScalaPB#1079
Hariprasath Thiagarajan
@Hprasath
Is there any link or documentation on shading Google's Protobuf and scala-collection-compat in a Maven build? I understand I can use the maven-shade-plugin, but what should the package be renamed to?
Nadav Samet
@thesamet
@Hprasath There isn't documentation for shading in a Maven build. The name you rename to can be anything.
Vishwesh Vinchurkar
@Vishvin95
Hi, I am using "com.thesamet.scalapb" %% "sparksql-scalapb" % "0.11.0-RC1", "com.thesamet.scalapb" %% "compilerplugin" % "0.10.10", and addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.0") with Spark 3.0.1. When I build, I see that Google's protobuf library doesn't get included in the JAR; it takes the scalapb-runtime library instead. Both share the same package name, com.google.protobuf, and I have assemblyMergeStrategy defined as MergeStrategy.first. How can I solve this conflict?
Nadav Samet
@thesamet
@Vishvin95 There is no overlap in the classes provided by scalapb-runtime and protobuf-java. Some of the classes in scalapb-runtime are generated code for standard protobufs, so they live under com.google.protobuf, but due to ScalaPB's default naming scheme there are no conflicts.
I think the reason you don't see com.google.protobuf in the jar is that it is being shaded.
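To illustrate the non-overlap with the class from the stack trace earlier in this log: the two artifacts place their DoubleValue classes at different paths, so both can coexist on a classpath:

import com.google.protobuf.DoubleValue                                 // Java class, from protobuf-java
import com.google.protobuf.wrappers.{DoubleValue => ScalaDoubleValue}  // Scala class, from scalapb-runtime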
Vishwesh Vinchurkar
@Vishvin95
@thesamet But when I submit the job to the cluster, I see the error java.lang.NoSuchMethodError: com.google.protobuf.CodedInputStream.readStringRequireUtf8()Ljava/lang/String;, which I guess is due to this particular package not getting included in the JAR. This method comes from protobuf-java.
Nadav Samet
@thesamet
@Vishvin95 Sounds like the shading doesn't work as expected. If it did, the error would refer to shadeproto.CodedInputStream.readStringRequireUtf8.
The error you are seeing results from the way you deploy the jar. The easiest way for me to help out is if you create a minimal example project that reproduces the issue.
This is a good starting point to fork and provide a reproducible example: https://github.com/thesamet/sparksql-scalapb-test
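For reference, the shading that would produce the shadeproto prefix mentioned above is typically configured with sbt-assembly shade rules along these lines (a sketch; the exact key syntax depends on the sbt-assembly version):

// build.sbt — rename protobuf-java and scala-collection-compat inside the fat jar
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll,
  ShadeRule.rename("scala.collection.compat.**" -> "scalacompat.@1").inAll
)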
Fredrik Wärnsberg
@frekw

How is .transform on a zio-grpc service supposed to be used?

Trying out the example,

decoratedService = MyService.transform(
  new api.grpc.middleware.LoggingTransform[MyService.Env]
)
_ <- ServerLayer
  .fromServiceList(
    ServerBuilder
      .forPort(50051)
      .addService(ProtoReflectionService.newInstance()),
    ServiceList.add(decoratedService)
  )
  .build
  .useForever
  .orDie
  .fork

I end up with a missing implicit for ZBindableService,

 could not find implicit value for parameter b: scalapb.zio_grpc.ZBindableService[R1,myservice.ZService[MyService.Env zio.Has[scalapb.zio_grpc.RequestContext],MyService.Env with zio.Has[scalapb.zio_grpc.RequestContext]]]
The only difference from https://scalapb.github.io/zio-grpc/docs/decorating is that I'm extending RService since I require an environment
Fredrik Wärnsberg
@frekw
However, doing this seems to work just fine:
val decorated = TransformableService[ZService]
  .transform[
    MyService.Env,
    Has[RequestContext],
    MyService.Env,
    Has[RequestContext]
  ](MyService, new api.grpc.middleware.LoggingTransform)
  .toLayer

val server = env >>> decorated >>> ServerLayer
  .access[ZService[Any, Has[RequestContext]]](
    ServerBuilder
      .forPort(50051)
      .addService(ProtoReflectionService.newInstance())
  )
Fredrik Wärnsberg
@frekw
Hm, scrap that: as soon as I add more dependencies to my ZTransform, I end up with the same ZBindableService error.