Protocol buffer compiler for Scala. Consider sponsoring ScalaPB: https://github.com/sponsors/thesamet
Hello, @thesamet! Are there any issues with scalapb/scalapb-grpcweb#79? I've tried to match errors by status with getStatus(), like I would do in grpc-java (https://grpc.github.io/grpc-java/javadoc/io/grpc/StatusRuntimeException.html), but in the Scala.js version the status field is private.
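For reference, this is the kind of JVM-side matching being described; a minimal sketch using the grpc-java types io.grpc.Status and io.grpc.StatusRuntimeException (the classify helper is a made-up name):

```scala
import io.grpc.{Status, StatusRuntimeException}

// Hypothetical helper: match a failed call by its gRPC status code.
// This works on the JVM because StatusRuntimeException.getStatus() is public there.
def classify(t: Throwable): String = t match {
  case e: StatusRuntimeException if e.getStatus.getCode == Status.Code.NOT_FOUND =>
    "not found"
  case e: StatusRuntimeException =>
    s"failed with status ${e.getStatus.getCode}"
  case _ =>
    "unexpected failure"
}
```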
Btw, thanks for the awesome library! It's easy to have an autowire-like library that handles the 2.13, 3.0, and Scala.js versions.
You can use transformContextM to preprocess a request; see https://scalapb.github.io/zio-grpc/docs/context#context-transformations
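For illustration, a minimal sketch of what such a transformation can look like, assuming zio-grpc 0.4/0.5-era APIs; AuthedContext and authenticate are made-up names, and serviceImpl stands for a generated service implementation whose context is AuthedContext:

```scala
import io.grpc.Status
import scalapb.zio_grpc.RequestContext
import zio.IO

// Hypothetical richer context derived from the incoming request.
case class AuthedContext(userId: String)

// Derive AuthedContext from the raw RequestContext (e.g. by reading metadata),
// failing the call with UNAUTHENTICATED when that is not possible.
def authenticate(rc: RequestContext): IO[Status, AuthedContext] =
  IO.fail(Status.UNAUTHENTICATED) // placeholder logic

// Given a service implementation whose context is AuthedContext,
// transformContextM(authenticate) yields a service whose context is
// RequestContext, running authenticate before every request:
// val wired = serviceImpl.transformContextM(authenticate)
```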
An exception or error caused a run to abort: com.google.protobuf.wrappers.DoubleValue$.messageCompanion()Lscalapb/GeneratedMessageCompanion;
java.lang.NoSuchMethodError: com.google.protobuf.wrappers.DoubleValue$.messageCompanion()Lscalapb/GeneratedMessageCompanion;
at scalapb.json4s.JsonFormat$.<init>(JsonFormat.scala:428)
at scalapb.json4s.JsonFormat$.<clinit>(JsonFormat.scala)
If I set PB.protocOptions := Seq("--java_out=lite") and run sbt compile, it warns me that protocOptions is not used.
protocOptions must be set at the config level: Compile / PB.protocOptions (see https://github.com/thesamet/sbt-protoc#additional-options).
However, since lite is not a protoc option but an option to the java generator, you should customize the target instead:

```scala
Compile / PB.targets += Target(PB.gens.java, (Compile / sourceManaged).value, Seq("lite"))
```

This way, sbt-protoc should honor the expected syntax (https://github.com/protocolbuffers/protobuf/blob/master/java/lite.md):

protoc --java_out=lite:${OUTPUT_DIR}
Direct usage of the Target constructor is not documented in sbt-protoc, as there are many implicits to build it out of Generator and File, but with 3 arguments I personally find it clearer this way. See https://github.com/scalapb/protoc-bridge/blob/0214bdd17e9e3034d421bc51f38a4b7f34934478/bridge/src/main/scala/protocbridge/Target.scala.
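Putting that together, a build.sbt sketch of the suggestion (the scalapb.gen() line is an assumption that the build also generates Scala sources):

```scala
// build.sbt
import protocbridge.Target

Compile / PB.targets := Seq(
  // Java lite generator: equivalent to protoc --java_out=lite:<out>
  Target(PB.gens.java, (Compile / sourceManaged).value, Seq("lite")),
  // Typical ScalaPB target (assumes the ScalaPB compiler plugin is on the build classpath)
  scalapb.gen() -> (Compile / sourceManaged).value
)
```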
Check your libraryDependencies to see if you are actually getting the versions you requested, or whether there is eviction going on due to another dependency. To further debug, it would be easiest if you can share a minimal project that reproduces this problem and file it as an issue with scalapb-json4s.
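For example, a sketch of pinning compatible versions (the scalapb-json4s version below is a placeholder):

```scala
// build.sbt: keep scalapb-json4s and scalapb-runtime on compatible versions.
// scalapb.compiler.Version.scalapbVersion is provided by the usual ScalaPB sbt setup;
// the scalapb-json4s version below is a placeholder, pick one matching your runtime.
libraryDependencies ++= Seq(
  "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion,
  "com.thesamet.scalapb" %% "scalapb-json4s"  % "0.11.0" // placeholder version
)
// Then run the built-in `evicted` task in the sbt shell to see what got evicted.
```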
great, I have opened thesamet/sbt-protoc#230 to document that
So my question resulted in an improvement in the docs. That's awesome. Thank you.
Hi, I am facing a specific issue using scalapb to deserialize protobuf. A small snippet from the proto file that is under the scanner is
```proto
message MapEntry {
  oneof value {
    string string_value = 5;
    double number_value = 6;
    StrArray string_array_value = 14;
    DblArray number_array_value = 15;
  }
}
```
When I create the assembly jar using a Maven build and run it on a Databricks environment using Spark 3 and Scala 2.12, I get the following error message:

java.lang.UnsupportedOperationException: Unable to find constructor for <package namespace>.MapEntry.Value. This could happen if <package namespace>.MapEntry.Value is an interface, or a trait without companion object constructor.
Some library information used in the project: the Scala version I am using is 2.12.10, the scalapb-runtime version is 0.9.0-M5, and the sparksql-scalapb version is 0.11.0-RC1.
Has anyone faced a similar issue?
Hi, I'm getting an exception stacktrace when trying to perform

```scala
val protoDF: DataFrame = ProtoSQL.protoToDataFrame(spark, rdd)
```

even though I am importing the correct implicits as my last import:
import spark.implicits.StringToColumn
import scalapb.spark.Implicits._
import scalapb.spark.ProtoSQL
I'm running Spark 3.0.0 and Scala 2.12.10, and I believe the shading is done correctly; here is the build.sbt to verify.
@DSwift510 @Hprasath, I'll need exact steps to reproduce to be able to further help with the issue you are seeing. Please create a minimal example of this problem and submit an issue on GitHub at https://github.com/scalapb/sparksql-scalapb/issues.
To save some time building a reproducible example, you can fork https://github.com/thesamet/sparksql-scalapb-test. Follow the instructions in README.md to modify and test it locally. Please keep only the changes necessary to demonstrate the problem. Once you are able to reproduce the problem, include a link to your fork in the ticket.
com.google.protobuf, but due to the default naming scheme of ScalaPB there are no conflicts.
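For context, the usual sparksql-scalapb recipe shades only those Java classes; a build.sbt sketch along the lines of its docs (the shadeproto prefix follows that example):

```scala
// build.sbt (sbt-assembly): rename the Java protobuf classes in the fat jar so
// they do not clash with the older protobuf-java that Spark ships with.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
)
```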
How is .transform on a zio-grpc service supposed to be used? Trying out the example:
```scala
decoratedService = MyService.transform(
  new api.grpc.middleware.LoggingTransform[MyService.Env]
)
_ <- ServerLayer
  .fromServiceList(
    ServerBuilder
      .forPort(50051)
      .addService(ProtoReflectionService.newInstance()),
    ServiceList.add(decoratedService)
  )
  .build
  .useForever
  .orDie
  .fork
```
I end up with a missing implicit for ZBindableService:

could not find implicit value for parameter b: scalapb.zio_grpc.ZBindableService[R1, myservice.ZService[MyService.Env with zio.Has[scalapb.zio_grpc.RequestContext], MyService.Env with zio.Has[scalapb.zio_grpc.RequestContext]]]
Using RService, since I require an environment:
```scala
val decorated = TransformableService[ZService]
  .transform[
    MyService.Env,
    Has[RequestContext],
    MyService.Env,
    Has[RequestContext]
  ](MyService, new api.grpc.middleware.LoggingTransform)
  .toLayer

val server = env >>> decorated >>> ServerLayer
  .access[ZService[Any, Has[RequestContext]]](
    ServerBuilder
      .forPort(50051)
      .addService(ProtoReflectionService.newInstance())
  )
```