Protocol buffer compiler for Scala. Consider sponsoring ScalaPB: https://github.com/sponsors/thesamet
@thesamet I am looking forward to the next buf release.
Meanwhile, I've tried to find the fs2-grpc compiler plugin executable, but without much success.
Scalapb-validate's plugin is published alongside the jars; however, fs2-grpc's is not there. In case I am looking at the wrong artifact, could you please point me to the correct one? Thank you :)
protocVersion, but possibly there are uses of it that don't cause this error (if ScalaPB isn't invoked, for example). For your specific use case, I suggest using a recent protoc and protobuf-java, and shading to avoid binary-compatibility issues with Spark: https://scalapb.github.io/docs/sparksql#setting-up-your-project.
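For reference, the shading approach on that page boils down to an sbt-assembly rename rule along these lines (a sketch based on the linked docs; the exact rename target and any additional rules in your build may differ):

```scala
// build.sbt -- sketch: shade protobuf-java so it cannot clash with the older
// protobuf runtime that ships with Spark (see the linked ScalaPB sparksql docs).
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
)
```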
@thesamet I thought it must be there already, because that particular version was built after your PR got merged. I will try to reach out to the maintainers.
Regarding ScalaPB itself, development would be much simpler if there were a ScalaPB organization in the buf registry that published the proto files, which could then be used like this instead of copying them manually.
@ngbinh yes you can find a simple example over here
Maybe I found a way to fix it.
diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala (revision 28d88a90150c9242ee9e3a57ff3b86da6f233ae9)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala (date 1636475671326)
@@ -69,8 +69,13 @@
input: Expression
): Expression = {
if (fd.isRepeated && !fd.isMapField) {
+      val nonNullInput = If(
+        IsNull(input),
+        Literal.fromObject(Array.empty[String]),
+        input
+      )
val objs = MapObjects(
- (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, input),
+ (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, nonNullInput),
input,
protoSql.singularDataType(fd)
)
This does not work "as-is". I am not very fluent with Catalyst ^^'
But this should be a good start.
singularDataType of fd.
@thesamet: This seems to work.
diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala (revision 07217938678b93309fb2a148f181e673fbfef21d)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala (date 1636493180889)
@@ -82,7 +82,10 @@
def mkMap(cmp: GeneratedMessageCompanion[_], args: ArrayData): Map[FieldDescriptor, PValue] = {
cmp.scalaDescriptor.fields
.zip(args.array)
- .filterNot(_._2 == PEmpty)
+ .filter {
+ case (_, null) | (_, PEmpty) => false
+ case _ => true
+ }
.toMap
.asInstanceOf[Map[FieldDescriptor, PValue]]
}
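For anyone following along, here is a self-contained sketch of what that filter change does, using stand-ins for ScalaPB's PValue hierarchy (the types here are simplified placeholders, not ScalaPB's real ones):

```scala
// Simplified stand-ins for ScalaPB's PValue hierarchy, for illustration only.
sealed trait PValue
case object PEmpty extends PValue
final case class PString(value: String) extends PValue

object FilterSketch extends App {
  // Simulated (field, value) pairs: one real value, one empty sentinel, one null.
  val fields: Seq[(String, PValue)] =
    Seq("a" -> PString("x"), "b" -> PEmpty, "c" -> null)

  // Same shape as the patched mkMap: drop both nulls and PEmpty before toMap.
  val kept = fields.filter {
    case (_, null) | (_, PEmpty) => false
    case _                       => true
  }

  println(kept) // List((a,PString(x)))
}
```

The point of the pattern match over the original filterNot is that `_ == PEmpty` never matches a null slot, so nulls used to slip through into the map.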
Too late to open the PR; I will do it tomorrow. Are you OK with the pattern matching, or do you prefer another construction?
Thanks for the advice. Hope it will solve my problem (I will test it on my production code tomorrow).
mkMap does not seem to exist in this version...
It happens with a highly nested message, but on a root field which is a repeated nested message.
message Auction {
Request request...
repeated Response responses...
...
}
If responses is null it fails, but it seems to handle the other null repeated fields properly.
addresses field of the Person message. So, I don't know what can cause this error...
spark.implicits._ and the protoSql.implicits._
Hey there!
Akka gRPC generates Scala classes from proto files like this:
trait MyServicePowerApi extends MyService {
def foo(in: MyRequest, metadata: Metadata)
def foo(in: MyRequest) = throw new GrpcServiceException(Status.UNIMPLEMENTED)
}
Is it possible to configure ScalaPB somehow to generate the following code?
trait MyServicePowerApi extends MyService {
def foo(in: MyRequest, metadata: Metadata)
def foo(in: MyRequest) = foo(in, new GrpcMetadataImpl(new io.grpc.Metadata()))
}
Hi all, I'm trying to run sbt clean protocGenerate on a MacBook with Apple Silicon (M1). I get the following error:
[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe: not found: https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe
I have added PB.protocVersion := "3.17.3" to my build.sbt, but it still fails. How do I fix this?
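One observation (not a confirmed fix): the error shows sbt still fetching protoc 3.15.6, so the PB.protocVersion override doesn't seem to be taking effect; osx-aarch_64 protoc binaries only started appearing on Maven Central around 3.17.x. Make sure the setting is a top-level assignment in build.sbt, not scoped inside an unrelated project's .settings(...) block, e.g.:

```scala
// build.sbt -- top level, so it applies to the whole build
PB.protocVersion := "3.17.3"
```

Then reload sbt and inspect the resolved setting (the exact key name to `show` may vary by sbt-protoc version) to confirm the override is actually picked up.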
@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.
And why did you say the metadata is different? It's the same.