Protocol buffer compiler for Scala. Consider sponsoring ScalaPB: https://github.com/sponsors/thesamet
message foo {
  string cause = 1;
}

message bar {
  string data = 1;
}

message my_trait {
  string id = 1;
  oneof sealed_value {
    foo fooEl = 2;
    bar barEl = 3;
  }
}
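For reference, ScalaPB's sealed oneofs turn the wrapper message into a sealed trait whose cases are the oneof's message types. A hand-written sketch of that shape (hypothetical names derived from the proto above, not the actual generated code, which also carries companions, parsers, and so on):

```scala
// Hand-written sketch of the sealed-trait shape a `sealed_value` oneof
// aims for (hypothetical, not ScalaPB's actual generated code).
sealed trait MyTrait
final case class Foo(cause: String) extends MyTrait
final case class Bar(data: String) extends MyTrait

// The sealed hierarchy allows exhaustive matching over the oneof cases.
def describe(t: MyTrait): String = t match {
  case Foo(c) => "foo: " + c
  case Bar(d) => "bar: " + d
}
```

Note that ScalaPB places restrictions on the wrapper message for sealed oneofs (see the ScalaPB docs on sealed oneofs), so check the generated code rather than assuming this exact shape.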
zio2 branch. The first test, TestServiceSpec.serverStreamingSuite, is failing and I'm having a hard time figuring out why. The test expects a response stream with two elements, Response("X1") and Response("X2"), but the stream only contains the first element. I've done a bit of debugging and the stream is exiting successfully; I can't figure out why the second stream element is never included in the response.
[error] Unspecified value parameter message.
[error] __inner = Option(__inner.fold(_root_.scalapb.LiteParser.readMessage[livongo.microservice.util.filtering.Testing.FilterTarget.Nested](_input__))(_root_.scalapb.LiteParser.readMessage(_input__, _)))
[error] ^
[error] /Users/peter.zhong/Desktop/platform-libraries/microservice-util/target/scala-2.13/src_managed/test/scalapb/livongo/microservice/util/filtering/Testing/FilterTarget.scala:433:127: not enough arguments for method readMessage: (input: com.google.protobuf.CodedInputStream, message: livongo.microservice.util.filtering.Testing.FilterTarget.Nested)(implicit cmp: scalapb.GeneratedMessageCompanion[livongo.microservice.util.filtering.Testing.FilterTarget.Nested]): livongo.microservice.util.filtering.Testing.FilterTarget.Nested.
[error] Unspecified value parameter message.
[error] __values += _root_.scalapb.LiteParser.readMessage[livongo.microservice.util.filtering.Testing.FilterTarget.Nested](_input__)
[error]
{"ListField": ["foo", "bar"]}
{"ListField": []}
But this one
{"ListField": ["foo", "bar"]}
{"ListField": []}
{}
fails with
Exception in thread "main" java.lang.NullPointerException
at com.adyoulike.data.proto.repeated.MyRepeated$.$anonfun$messageReads$4(MyRepeated.scala:84)
at scala.Option.map(Option.scala:230)
at com.adyoulike.data.proto.repeated.MyRepeated$.$anonfun$messageReads$1(MyRepeated.scala:84)
(that is to say, the __fieldMap.get(...) line, where the resulting Option[Seq[...]] is mapped over).
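A plain-Scala sketch (my own reconstruction, not ScalaPB's actual code) of why mapping over an Option that wraps a null throws the NullPointerException seen in the stack trace, while Option(...) would not:

```scala
import scala.util.Try

// A null Seq standing in for a repeated field missing from the JSON input.
val missing: Seq[String] = null

// Some(null) keeps the null inside, so mapping over it throws an NPE.
val unsafe: Option[Seq[String]] = Some(missing)
val blewUp = Try(unsafe.map(_.size)).isFailure // true: NullPointerException

// Option(...) collapses null to None, so the function is never applied.
val safe: Option[Seq[String]] = Option(missing)
val ok = safe.map(_.size) // None, no exception
```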
See OuterCaseClass and OuterCaseClassTimestamp for an example. The second case class uses frameless injections to customize the timestamp type: https://github.com/scalapb/sparksql-scalapb/blob/4017c0c2dea0e261e731547fbdee1202a2e61b16/sparksql-scalapb/src/test/scala/PersonSpec.scala#L34-L35
nulls for the missing value. ScalaPB expects empty arrays, not nulls, when parsing a dataframe. I think it would be useful to have ScalaPB accept nulls when parsing Spark dataframes. You could modify your input sources, or prepare the data for consumption by ScalaPB by replacing the nulls. If supporting nulls would be useful for your project, feel free to file an issue with sparksql-scalapb.
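One way to do that pre-processing (a hypothetical helper, not part of sparksql-scalapb) is to collapse null repeated columns to empty sequences before ScalaPB sees them:

```scala
// Hypothetical normalizer: ScalaPB expects an empty Seq for an absent
// repeated field, so map nulls to Seq.empty before parsing.
def normalizeListField(values: Seq[String]): Seq[String] =
  Option(values).getOrElse(Seq.empty)
```

A null input becomes Seq.empty, and a non-null Seq passes through unchanged.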