Nadav Samet
@thesamet
@remiguittaut for the first question, it would be better to ask the grpc-java folks. ScalaPB/zio-grpc uses it directly for building the server and for handling request parsing. For the second question, I suspect the answer is no, due to a limitation in grpc-java. Maybe your app can host it on two different ports and use nginx, envoy, or another proxy to make it appear on the same port from the outside. Using a proxy might also solve your first question.
Alexandre Lebrun
@xela85
Hello!
I recently discovered prototool and buf and would like to enable one of these linters in my schemas repository. I set up a naive CI job that runs buf on the proto resources. Unfortunately, import "scalapb/scalapb.proto"; is unresolved and leads to compilation errors. How do you circumvent these kinds of errors (same problem in the IJ plugin)? Thanks in advance :)
Philipp Hoffmann
@philipphoffmann
Hey there! We could use some help with ScalaPB together with Spark. Basically our question is: how do we deal with protobuf extensions in Spark? We have already figured out that ScalaPB generates Scala case classes for our protobuf extensions just fine. But deserializing our binaries in Spark gives us a DataFrame with the schema inferred for our root message only (without any extension fields). I think this is more or less expected, since we use something like RootMessage.parseFrom(...) to deserialize, which only gives us the RootMessage. So I guess my question is: how can we deserialize the extension fields? Thanks for your help :)
Nadav Samet
@thesamet
@xela85 add a copy of scalapb/scalapb.proto somewhere in your source tree or environment, and add the path of the parent directory (the one containing scalapb as a subdirectory) to the search path used by those linters.
@philipphoffmann Currently there's no support for extensions in sparksql-scalapb. Theoretically there could be many extensions to any message, so the schema could be pretty large if we took the union. There's also no facility to discover all extensions for a given message. However, if there's a specific extension you are expecting, you could do { payload => val msg = RootMessage.parseFrom(payload); (msg, msg.extension(ext)) } to return both the message and the extension you expect as a tuple (or define a new case class), and transform it into a dataframe from there.
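A minimal sketch of that suggestion (RootMessage, MyProto.myExt, and binaryDs are hypothetical stand-ins for the generated root message, the generated extension object, and a Dataset[Array[Byte]] of payloads):

import scalapb.spark.Implicits._ // sparksql-scalapb encoders

// Parse each binary payload once, then pull out the one extension we expect,
// so both the root message and the extension field survive into the Dataset.
val withExt = binaryDs.map { payload =>
  val msg = RootMessage.parseFrom(payload)
  (msg, msg.extension(MyProto.myExt))
}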
Philipp Hoffmann
@philipphoffmann
@thesamet excellent Nadav. Thats exactly our missing piece to the puzzle :) thx a lot!
Alexandre Lebrun
@xela85
Perfect, thank you :)
Mike Dias
@mikedias
Hello, it seems that the sparksql-scalapb artifact doesn't have the latest 5 versions published. Is that expected?
Nadav Samet
@thesamet
Hi @mikedias, there are no unpublished artifacts. I assume you expected the version to match the ScalaPB release version, but those two libraries are released independently. The versions of sparksql-scalapb and scalapb need to match according to the table here: https://scalapb.github.io/docs/sparksql/#setting-up-your-project
Rohit Ramprasad
@rrampr

Hello, this might be a simple question but I don't see a straightforward way around it. ScalaPBC generates case classes from proto definitions, while the regular protobuf compiler generates Java classes (which inherit from com.google.protobuf.GeneratedMessageV3 and implement com.google.protobuf.Message).

If I'm writing a function that can naively operate on both kinds of generated proto classes, how would I do so? There is no common ancestor or interface I can use.
The reason I want this: I'm writing a small internal library in Java that will be used from Scala services as well as Java services. The input to one of my functions is going to be a protobuf type. I can't use Message as the type of the argument, as it wouldn't work from a Scala service.

Nadav Samet
@thesamet
@rrampr You can write two functions. They can even have the same name since the parameter types are going to be different.
1 reply
There is nothing that abstracts over the two protobuf types. If you want to minimize duplication, you can use the typeclass pattern: define a typeclass that provides the abstractions you need over the two proto types, and supply implicit instances for each.
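A minimal sketch of that typeclass pattern (the trait name and method are illustrative, not part of ScalaPB):

// Illustrative typeclass abstracting over the two generated representations.
trait ProtoBytes[A] {
  def toBytes(a: A): Array[Byte]
}

object ProtoBytes {
  // Instance for ScalaPB-generated messages.
  implicit def scalaPb[A <: scalapb.GeneratedMessage]: ProtoBytes[A] =
    (a: A) => a.toByteArray

  // Instance for Java protobuf messages.
  implicit def javaPb[A <: com.google.protobuf.Message]: ProtoBytes[A] =
    (a: A) => a.toByteArray
}

// A function that works uniformly with either kind of generated class:
def sizeOf[A](msg: A)(implicit pb: ProtoBytes[A]): Int =
  pb.toBytes(msg).length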
Timothy Bess
@tdbgamer

Hey, so is there a generic way to reference the Java class associated with a generated Scala proto?

I wanted to provide an interface that takes A and B that are GeneratedMessages and automatically serialize to/from Java protos and provides a Scala only interface to the user. Currently I have to do:
abstract class MyInterface[A, AJava, B, BJava](implicit aJavaSupport: JavaProtoSupport[A, AJava], bJavaSupport: JavaProtoSupport[B, BJava])
Which works, but it's annoying because the user has to fill out a giant type signature. Ideally they shouldn't have to care about or reference the Java classes. Any ideas?

Nadav Samet
@thesamet
Hi @tdbgamer , I think this can be improved if we introduce the Java type as a dependent type in JavaProtoSupport. Can you provide additional context on what you are trying to enable for your users? What methods will you have in this abstract class? If the intent is to make it easier to convert Java to Scala protos, why does it need two types (A and B)?
Timothy Bess
@tdbgamer
I'm making an internal wrapper around Kafka where each service takes an input proto and returns an output proto. I basically want those services to only have to care about Scala protos, and I will transparently convert to/from Java protos when I write to or read from Kafka.
Timothy Bess
@tdbgamer

https://pastebin.com/n0UbrFum

^ This is the base class

It essentially uses the JavaProtoSupport to create RecordSerializers and RecordDeserializers for the Scala protos.
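For readers following along, a rough sketch of that idea, assuming Kafka's Serializer interface and ScalaPB's JavaProtoSupport (the class name is hypothetical):

import org.apache.kafka.common.serialization.Serializer
import scalapb.{GeneratedMessage, JavaProtoSupport}

// Hypothetical sketch: serialize a Scala proto by first converting it to its
// Java counterpart, so Java-proto-based tooling sees a Java message.
class ScalaViaJavaSerializer[A <: GeneratedMessage, J <: com.google.protobuf.Message](
    implicit support: JavaProtoSupport[A, J]
) extends Serializer[A] {
  override def serialize(topic: String, data: A): Array[Byte] =
    support.toJavaProto(data).toByteArray
}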
Nadav Samet
@thesamet
Got it. Makes sense. Does Kafka care about the Java protos at all? I haven't worked with it much, but I thought it could accept arbitrary payloads (arrays of bytes).
Nadav Samet
@thesamet
@tdbgamer if we want MyInterface to only depend on A and B, we could introduce some typeclass like JavaProtoSupport[A <: GeneratedMessage]. Sounds like what you need in it is parseFromJava(a: A#JavaType): A and scalaToJava(a: A): A#JavaType. Any other requirements?
6 replies
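Spelled out, that proposal might look like this (a hypothetical design, not the current ScalaPB API):

import scalapb.GeneratedMessage

// Hypothetical: the Java counterpart becomes a type member instead of a second
// type parameter, so call sites only need to name the Scala-side types.
trait JavaSupport[A <: GeneratedMessage] {
  type JavaType
  def scalaToJava(a: A): JavaType
  def parseFromJava(j: JavaType): A
}

// MyInterface then depends only on A and B:
abstract class MyInterface[A <: GeneratedMessage, B <: GeneratedMessage](
    implicit aSupport: JavaSupport[A], bSupport: JavaSupport[B]
)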
Timothy Bess
@tdbgamer

It can, but schema registry uses the Java protos (I assume through the file descriptor) to track changes to schema, ensure backwards compatibility, and allow for dynamically deserializing protobuf messages.

But there are lots of tools in the Kafka ecosystem that need to dynamically deserialize messages on topics, and to do that with protobuf you need the schema of the message that was published, since protobuf doesn't encode type information in its binary format. So Kafka producers essentially encode the schema registry id with each message that is published. Consumers can then pull down a message, extract the schema id, fetch the proto definition from the schema registry, and use that definition to deserialize the message.

1 reply
So if you ETL data from Kafka to S3 dynamically, it doesn't have to know all the schemas, it can just use the schema registry to make a DynamicMessage and put it in S3 as JSON or whatever
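For context, the framing described above is Confluent's wire format: a magic byte, then a 4-byte big-endian schema id, then the payload (for protobuf, message-index varints precede the body, if I recall the framing correctly). A minimal sketch of extracting the id:

import java.nio.ByteBuffer

// Read the schema registry id from a Confluent-framed record:
// byte 0 is a magic byte (0), bytes 1-4 are the schema id (big-endian).
def schemaId(record: Array[Byte]): Int = {
  val buf = ByteBuffer.wrap(record)
  require(buf.get() == 0, "unexpected magic byte")
  buf.getInt()
}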
Rohit Ramprasad
@rrampr
We are using scalapb with Bazel and not sbt. Is there a way to flip on javaConversions=true with Bazel?
2 replies
Nimrod Sadeh
@nsadeh
Hello, I am having trouble understanding the example in https://scalapb.github.io/zio-grpc/docs/decorating. I am trying to have ZIO Logging available to my services; what would that look like? I am not sure I understand how this ZTransform works.
Nadav Samet
@thesamet
Hi @nsadeh , I haven't tried using zio-logging, however @frekw once looked into it (https://github.com/scalapb/zio-grpc/issues/52#issuecomment-628612446) - @frekw will you be able to help out here?
3 replies
Nadav Samet
@thesamet

@tdbgamer , I played with it a little bit today. It seems like the change would require introducing duplication (two classes for JavaProtoSupport, one with two type parameters and one with a dependent type), or, if I eliminate the old one, a breaking change, so I want to think about it more. However, another approach could be adding an implicit parameter to all method signatures in your interface. This makes the Java type automatically found:

class MyGenericClass[A <: scalapb.GeneratedMessage] {
  // JT is inferred from the implicit JavaProtoSupport instance in scope.
  def toJava[JT](s: A)(implicit jps: scalapb.JavaProtoSupport[A, JT]): JT =
    jps.toJavaProto(s)
}

This makes the Java type (JT) automatically inferred by the compiler. Your user calls myGenericClass.toJava(scalaProto).

Timothy Bess
@tdbgamer
Ah, I think the issue is that I was trying to have the superclass use the implicits to generate a bunch of functions for the subclass, so the JavaProtoSupport implicit had to be on the constructor or I end up having to do a lot of boilerplate overrides in the subclass. I'll test that out again in a sec and see if it works.
Timothy Bess
@tdbgamer
Screenshot from 2021-09-15 11-29-54.png
So the issue is that these RecordSerializer and RecordDeserializer implicits are provided as long as you have a ProtobufSettings and a JavaProtoSupport implicit.
Timothy Bess
@tdbgamer
I could probably attach the implicit to that def directly, but I was trying to keep the protobuf implicits from permeating all the other superclasses, so that the library could support other serialization formats.
Nimrod Sadeh
@nsadeh
@thesamet I am looking at https://github.com/scalapb/zio-grpc/blob/master/core/src/main/scalajvm/scalapb/zio_grpc/ServerMain.scala to add Logging as a dependency on services without context transforms, what's the role of ProtoReflectionService?
Nimrod Sadeh
@nsadeh
Asking because the .addService invocation seems to confuse the type signature
Nadav Samet
@thesamet
ProtoReflectionService makes it possible for grpc clients to list the services and methods available on a grpc server
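In plain grpc-java terms (which zio-grpc builds on), the reflection service is registered like any other service; a sketch:

import io.grpc.ServerBuilder
import io.grpc.protobuf.services.ProtoReflectionService

// The reflection service is just another BindableService; once added, clients
// such as grpcurl can list the server's services and methods.
val server = ServerBuilder
  .forPort(9000)
  .addService(ProtoReflectionService.newInstance())
  .build()
  .start()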
I'll have limited internet connectivity for the rest of September. I will read everything once I'm back, and I suggest filing GitHub issues for anything that requires follow-up.
Rohit Ramprasad
@rrampr
Running into odd issues when using com.google.protobuf.any.Any. There are two overloaded pack methods that can be invoked on Any, and the compiler gets confused about which one to invoke:
error: overloaded method pack with alternatives:
  [A <: scalapb.GeneratedMessage with scalapb.Message[A]](generatedMessage: A, urlPrefix: String): com.google.protobuf.any.Any <and>
  [A <: scalapb.GeneratedMessage with scalapb.Message[A]](generatedMessage: A): com.google.protobuf.any.Any
cannot be applied to (generatedMessage: scalapb.GeneratedMessage with scalapb.Message[_])
  event = Any.pack(generatedMessage = message),
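The error suggests message has lost its concrete type: its static type is the existential GeneratedMessage with Message[_], which satisfies neither overload's bound. One way around it, sketched here against the pre-0.10.x signatures quoted in the error, is to keep the concrete type as a type parameter at the call site:

import com.google.protobuf.any.{Any => PbAny}
import scalapb.{GeneratedMessage, Message}

// Keep A concrete so it satisfies pack's bound A <: GeneratedMessage with Message[A],
// instead of widening the message to an existential type first.
def toAny[A <: GeneratedMessage with Message[A]](m: A): PbAny =
  PbAny.pack(m)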
Sathyaprakash Dhanabal
@happysathya

Hi, I am using a Mac M1.

While using the ScalaPB compiler plugin I am running into trouble.

In plugins.sbt:

addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.3")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.5"

In build.sbt, I am trying to override the protoc version

PB.protocVersion := "3.17.3"

so that I can use the osx-aarch_64 protoc artifact. But I am getting the following error: "Could not find artifact com.google.protobuf:protoc:exe:osx-aarch_64:3.15.6". Not sure why it is still referring to 3.15.6.

Any suggestions?

1 reply
Fredrik Wärnsberg
@frekw
In zio-grpc, is there any way to intercept a client call via ZClientInterceptor such that you get to wrap the actual outgoing call? I'm trying to write an interceptor that automatically adds metrics + tracing for any zio-grpc client.
Is my best bet to just grab the ClientCall.Listener from a ForwardingZClientCall and run unsafeRun from onMessage?
Fredrik Wärnsberg
@frekw
No wait that won't work
adriansnep16
@adriansnep16
Hi!! I'm new to ScalaPB and I have a question... in proto3, how can I create a sealed trait that also contains fields? Because as far as I've read so far, oneof sealed_value doesn't allow it.
Nadav Samet
@thesamet
@adriansnep16 you mean having primitive types (ints, strings) and not just messages? Scala wouldn't allow it, since all the types that extend a sealed trait must be defined in the same file, and ints and strings are already defined elsewhere.
@frekw I have limited internet connectivity in the coming week, if needed feel free to file an issue on GitHub and I'll get back to it.
adriansnep16
@adriansnep16
No, I mean a structure like this:

message foo {
  string cause = 1;
}

message bar {
  string data = 1;
}

message my_trait {
  string id = 1;
  oneof sealed_value {
    foo fooEl = 2;
    bar barEl = 3;
  }
}

This will fail at compile time because of the string id present in my_trait, since oneof sealed_value doesn't allow additional fields... My question is whether there is a way to model my_trait so that it also contains the string id field.
Nadav Samet
@thesamet
I think what you want is to create another message, MyTraitWithId, which would have two fields: string id and my_trait mt.
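Roughly, the Scala side of that suggestion would look like this (illustrative shapes only, not the exact generated code):

// Illustrative: the sealed_value oneof becomes a sealed trait whose cases wrap
// foo and bar, and a separate wrapper message carries the shared id field.
sealed trait MyTrait
final case class Foo(cause: String) extends MyTrait
final case class Bar(data: String) extends MyTrait

final case class MyTraitWithId(id: String, mt: MyTrait)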
Fredrik Wärnsberg
@frekw
@thesamet will do :pray:
Karan Shah
@karanssh
Hi folks! I am trying to debug a weird issue I am running into. I have a Scala grpc server, and clients in Go and Java. I can connect from the Go and Java clients fine on Ubuntu, where I build all of them, but when I compile them on a CentOS box and try, the Scala grpc server spins up and accepts connections but never responds. Any pointers on how I could debug this would be appreciated.
Nadav Samet
@thesamet
Hi @karanssh, ScalaPB uses grpc-java under the hood. It sounds like the issue is unrelated to ScalaPB, given that it happens when you change the operating system of the non-Scala clients. I suggest reproducing it with an example server and client and asking on a grpc help forum.