Protocol buffer compiler for Scala. Consider sponsoring ScalaPB: https://github.com/sponsors/thesamet
@thesamet I thought it must be there already because that particular version was built after your PR got merged. I will try to reach out to the maintainers.
Regarding ScalaPB itself, development would be much simpler if there were a ScalaPB organization in the Buf registry that published the proto files, which could then be consumed as a dependency instead of being copied manually.
@ngbinh Yes, you can find a simple example over here.
Maybe I found a way to fix it.
diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala (revision 28d88a90150c9242ee9e3a57ff3b86da6f233ae9)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/FromCatalystHelpers.scala (date 1636475671326)
@@ -69,8 +69,13 @@
input: Expression
): Expression = {
if (fd.isRepeated && !fd.isMapField) {
+ val nonNullInput = If(
+ IsNull(input),
+ input,
+ Literal.fromObject(Array.empty[String])
+ )
val objs = MapObjects(
- (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, input),
+ (input: Expression) => singleFieldValueFromCatalyst(cmp, fd, nonNullInput),
input,
protoSql.singularDataType(fd)
)
This does not work "as-is". I am not very fluent with Catalyst ^^'
But this should be a good start.
Note the singularDataType of fd.
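To illustrate the intent of the patch above outside of Catalyst: the goal is to coalesce a null repeated column into an empty collection before converting each element. Here is a plain-Scala sketch of that idea (the function name and shapes are hypothetical, not ScalaPB's actual API):

```scala
// Hypothetical sketch: treat a null repeated column as an empty
// collection, then convert each element. In the real patch, the
// null check and per-element conversion happen as Catalyst
// expressions (IsNull / If / MapObjects) rather than plain Scala.
def fromCatalystRepeated[A, B](input: Seq[A], convert: A => B): Seq[B] = {
  val nonNullInput = if (input == null) Seq.empty[A] else input
  nonNullInput.map(convert)
}
```

Note that the diff as posted guards the wrong branch (it returns `input` when it is null), which matches the author's remark that it does not work as-is; the sketch shows the intended direction of the check.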
@thesamet: This seems to work.
diff --git a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala
--- a/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala (revision 07217938678b93309fb2a148f181e673fbfef21d)
+++ b/sparksql-scalapb/src/main/scala/scalapb/spark/JavaHelpers.scala (date 1636493180889)
@@ -82,7 +82,10 @@
def mkMap(cmp: GeneratedMessageCompanion[_], args: ArrayData): Map[FieldDescriptor, PValue] = {
cmp.scalaDescriptor.fields
.zip(args.array)
- .filterNot(_._2 == PEmpty)
+ .filter {
+ case (_, null) | (_, PEmpty) => false
+ case _ => true
+ }
.toMap
.asInstanceOf[Map[FieldDescriptor, PValue]]
}
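The essence of this second patch is that the old filterNot(_._2 == PEmpty) let null values through, which then blew up later. A self-contained sketch of the same filter, with hypothetical stand-ins for ScalaPB's descriptor and value types:

```scala
// Hypothetical stand-ins for scalapb's FieldDescriptor/PValue, just to
// show the filter shape: drop a pair when its value is null OR PEmpty.
sealed trait PValue
case object PEmpty extends PValue
final case class PInt(v: Int) extends PValue

def mkMap(fields: Seq[String], args: Seq[PValue]): Map[String, PValue] =
  fields
    .zip(args)
    .filter {
      case (_, null) | (_, PEmpty) => false
      case _                       => true
    }
    .toMap
```

The pattern-match form is arguably clearer than chaining two filterNot calls, since it documents both excluded cases in one place.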
Too late to open the PR today; I will do it tomorrow. Are you OK with the pattern matching, or do you prefer another code construction?
Thanks for the advice. Hope it will solve my problem (I will test it on my production code tomorrow).
mkMap does not seem to exist in this version...
It happens with a highly nested message, but on a root field that is a repeated nested message.
message Auction {
Request request...
repeated Response responses...
...
}
If responses is null it fails. But it seems to handle the other null repeated fields properly.
addresses field of the Person message. So, I don't know what can cause this error...
spark.implicits._ and the protoSql.implicits._
Hey there!
Akka gRPC generates Scala classes from proto files like this:
trait MyServicePowerApi extends MyService {
def foo(in: MyRequest, metadata: Metadata)
def foo(in: MyRequest) = throw new GrpcServiceException(Status.UNIMPLEMENTED)
}
Is it possible to configure ScalaPB somehow to generate the following code?
trait MyServicePowerApi extends MyService {
def foo(in: MyRequest, metadata: Metadata)
def foo(in: MyRequest) = foo(in, new GrpcMetadataImpl(new io.grpc.Metadata()))
}
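The requested change is purely a codegen default: have the metadata-less overload forward to the metadata-aware one with empty metadata instead of throwing UNIMPLEMENTED. A simplified, self-contained Scala sketch of that trait shape (the Metadata type and service here are hypothetical, not Akka gRPC's actual classes):

```scala
// Simplified stand-in for gRPC metadata.
final case class Metadata(entries: Map[String, String] = Map.empty)

trait MyService {
  // Metadata-less overload forwards with empty metadata by default,
  // so implementors only have to provide the two-argument variant.
  def foo(in: String): String = foo(in, Metadata())
  def foo(in: String, metadata: Metadata): String
}

object EchoService extends MyService {
  def foo(in: String, metadata: Metadata): String = s"echo:$in"
}
```

With this shape, EchoService.foo("hi") works without the caller supplying metadata.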
Hi all, I'm trying to run sbt clean protocGenerate on a MacBook with Apple Silicon (M1). I get the following error:
[error] lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe: not found: https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.15.6/protoc-3.15.6-osx-aarch_64.exe
I have added PB.protocVersion := "3.17.3" to my build.sbt but it still fails. How do I fix this?
@squadgazzz ScalaPB doesn't have a way to generate services where metadata is passed as a parameter to every rpc call.
And why did you mention different metadata? It's the same:
client.withInterceptors(MetadataUtils.newAttachHeadersInterceptor(metadata))
Will v0.10.5 be released soon? Do I need to perform any action?
Hi there, I recently attempted to update one of my projects from sbt 1.4.9 to 1.5.5 and am now unable to resolve the binary protoc-gen-validate dependencies:
[error] (client / update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin: not found: https://repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.protoc-plugin
It seems to be appending the wrong extension (.protoc-plugin vs .exe). The same dependency resolved correctly via sbt 1.4.9:
sbt:project> show client/protobuf:managedClasspath
...
[info] * Attributed(/Users/dk/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/io/envoyproxy/protoc-gen-validate/protoc-gen-validate/0.6.2/protoc-gen-validate-0.6.2-osx-x86_64.exe)
Curious if anyone can explain the different extension, or suggest a workaround/fix?
.dependsOn(). I've put together this minimal example that reproduces the issue: https://github.com/dkichler/protoc-plugin-resolution-issue