Nadav Samet
@thesamet
Another approach would be to have a match type that considers unset as false.
That is, it's another MatchType, like CONTAINS, that has the desired behavior described above.
molotch
@molotch

Hi,

When I use a subproject in sbt for the proto files, all generated files show the error

object scalapb is not a member of package <root> - did you mean scala?

Couldn't find any explicit instructions on how to proceed when using subprojects, so I'm a bit lost on what the problem could be.

Nadav Samet
@thesamet
@molotch it sounds like PB.targets is not inside the subproject's settings.
Nadav Samet
@thesamet
PB.targets goes inside the subproject; having scalapb.gen as a target adds scalapb-runtime as a dependency to the subproject.
The error you get indicates that scalapb-runtime didn't get automatically added to your project.
If this doesn't resolve the issue, please share build.sbt and the project's directory structure.
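For reference, a minimal sketch of that layout, assuming a hypothetical subproject named protos (adjust the names to your build):

```scala
// build.sbt - sketch only; project and directory names are made up
lazy val protos = (project in file("protos"))
  .settings(
    // PB.targets belongs in the subproject's settings; listing scalapb.gen()
    // here is what adds scalapb-runtime to this subproject automatically
    Compile / PB.targets := Seq(
      scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
    )
  )

lazy val root = (project in file("."))
  .dependsOn(protos)
```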
molotch
@molotch
Thanks! I'll look into it and get back if I don't get it to work.
Luka Jurukovski
@luka_jurukovski_twitter

Hello again! I imagine this is me missing something again, but I'm at a bit of a loss as to what I should do. When importing the proto-google-common-protos-scalapb I'm running into this error:

object TimeOfDayValidator is not a member of package com.google.type.timeofday

And other errors like this associated with the Google common objects. If I understand the problem correctly, it's because validations are not generated in proto-google-common-protos-scalapb, so the code generation references something that doesn't exist when I do validation in my files.

This sounds like the problem that the skip feature is trying to address, so I tried to use it. However, I think it's not available in this case because a package-scoped setting is already defined for google, so it ends up clashing when I define it.

Nadav Samet
@thesamet
@luka_jurukovski_twitter can you file a github issue for this, and link to a github repo with a minimal project that demonstrates this?
Luka Jurukovski
@luka_jurukovski_twitter
I will attempt to; unfortunately, the current case is not code I can share.
Nadav Samet
@thesamet
Sure, that's exactly the intent of "minimal project": remove all unnecessary details.
Luka Jurukovski
@luka_jurukovski_twitter
Yep, just wanted to ack your request since you have always been so prompt with answering questions :)
Turns out it was not difficult to replicate. Spent a second to validate that this setup also works when not using google-common.
https://github.com/11Dimensions/simple-scalapb-project
Nadav Samet
@thesamet
Perfect, thanks for the clean reproduction of the issues. This cuts the effort it takes to identify a solution and test it on my end. I'll be looking into this in the coming days.
Luka Jurukovski
@luka_jurukovski_twitter
Sounds good, I can open an issue if you can confirm that this isn't a case of PEBCAK. If it is, I apologize in advance. And as always, I appreciate the engagement and your project!
Nadav Samet
@thesamet
No, it seems like it's a real issue - there's no way to specify a boundary to stop validating third party protos if they already have package options defined.
Luka Jurukovski
@luka_jurukovski_twitter
Created an issue and attempted to record what was discussed here
scalapb/ScalaPB#1358
joramnv
@joramnv

Hi, I’m trying to use the compiled result of ScalaPB in another project. But whenever I try to use the generated classes in the other project, I am getting a compilation error:

[error] java.lang.AssertionError: assertion failed: module class MyProtoAsScalaClass$ has non-class parent: val <none>
[error] scala.runtime.Scala3RunTime$.assertFailed(Scala3RunTime.scala:8)
[error] dotty.tools.dotc.core.SymDenotations$ClassDenotation.traverse$1(SymDenotations.scala:1899)
[error] dotty.tools.dotc.core.SymDenotations$ClassDenotation.computeBaseData(SymDenotations.scala:1904)
[error] dotty.tools.dotc.core.SymDenotations$BaseDataImpl.apply(SymDenotations.scala:2872)
[error] dotty.tools.dotc.core.SymDenotations$ClassDenotation.baseData(SymDenotations.scala:1870)
...

Using Scala 3.1.1 in both projects.

Nadav Samet
@thesamet
@joramnv nothing comes to mind from seeing this. Can you file an issue on github with exact steps to reproduce this?
Luka Jurukovski
@luka_jurukovski_twitter

scalapb/common-protos#165

Probably not the right way to solve 1358, but just in case it might be.

Nadav Samet
@thesamet

@joramnv The reason you are seeing this problem is that you are copying the generated jar, but your project is still missing the other jars that are dependencies of that jar (scalapb-runtime_3, lenses_3, protobuf-java). I validated that if these jars are copied too, then it works. The compiler error message you see could be better, but that's a Scala compiler bug.

Is there a reason you're trying to use the generated classes in another project by manually copying the jar? Are you familiar with publish and publishLocal, which can help you introduce a dependency between the projects so that all the necessary dependencies are downloaded for you?
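In case it helps, a rough sketch of the publishLocal flow, assuming a hypothetical project named protos and made-up coordinates:

```scala
// In the build that generates the code - then run `protos / publishLocal` from the sbt shell
ThisBuild / organization := "com.example"
ThisBuild / version      := "0.1.0-SNAPSHOT"

// In the consuming project's build.sbt: depend on the published artifact, and
// scalapb-runtime_3, lenses_3 and protobuf-java are pulled in transitively
libraryDependencies += "com.example" %% "protos" % "0.1.0-SNAPSHOT"
```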

joramnv
@joramnv
The reason is that manually copying is the easiest way to start out (at least I thought so, but I've been proven wrong). It didn't occur to me that I'd need the transitive dependencies to use the generated classes.
joramnv
@joramnv
Thanks for your help @thesamet, it works as expected. :smiley:
Vincent Lafeychine
@v-lafeychine:matrix.org
[m]
Hi, I was wondering if Scala Native 3.x support will get completed.
It seems that the patch is on standby (scalapb/protobuf-scala-runtime#169), is there any news?
Nadav Samet
@thesamet
Hi @v-lafeychine:matrix.org, I migrated protobuf-scala-runtime. ScalaPB relies on munit for testing, which doesn't yet support Scala 3 on Native. I filed this request: scalameta/munit#524
Vincent Lafeychine
@v-lafeychine:matrix.org
[m]
Thanks for your quick reply!
Jeroen Knoef
@JeroenKnoef-TomTom
Hi, we're using ScalaPB for Scala, Java and Python. The Python consumers have now requested that we also generate mypy bindings. This boils down to installing another protoc plugin (https://github.com/nipunn1313/mypy-protobuf) and adding a --mypy_out=location option to the command. Is there a way to integrate this into build.sbt?
Brice Jaglin
@bjaglin
@JeroenKnoef-TomTom you can have a look at https://github.com/thesamet/sbt-protoc#to-invoke-a-plugin-that-is-already-locally-installed. If pip install is fast, you can probably do the install via https://www.scala-sbt.org/1.x/docs/Process.html directly within Compile / PB.targets.
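A rough sketch of what that could look like, assuming protoc-gen-mypy is already installed on a known path and using PB.gens.plugin(name, path) as described in the sbt-protoc README linked above (paths and output directories here are made up):

```scala
// build.sbt - sketch only; adjust the path to wherever pip installed protoc-gen-mypy
Compile / PB.targets := Seq(
  scalapb.gen() -> (Compile / sourceManaged).value / "scalapb",
  // invoke the locally installed mypy-protobuf plugin; the plugin name "mypy"
  // is what makes protoc emit a --mypy_out pointing at the chosen directory
  PB.gens.plugin("mypy", "/usr/local/bin/protoc-gen-mypy") -> (Compile / sourceManaged).value / "mypy"
)
```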
Jeroen Knoef
@JeroenKnoef-TomTom
Thanks @bjaglin, that looks promising.
Denis Savitsky
@desavitsky
Hi!
Are there plans to release a zio2-based version of zio-grpc?
Nadav Samet
@thesamet
Hi @densavi96 There is no plan, but there is intention. Help is needed to get the zio2 branch into working order - I would love to have someone drive this to completion.
richg75
@richg75
Hi - I have a question on customisations. I need to remove the _root_.scala.Option[myDataType] wrapper surrounding myDataType. I've tried a primitive wrappers options statement at both package and file level to remove the primitive wrappers, but it doesn't remove them. Can you advise whether the primitive wrappers option should remove the _root_.scala.Option[] wrapper, and where (package or file) that option needs to be set? Ideally I want to set it at the package level to avoid having to change the proto files. Many thanks in advance for your help! :)
Nadav Samet
@thesamet
@richg75 Look for the no_box option in https://scalapb.github.io/docs/customizations/. If you only want to target a few fields by name without modifying the original files, look for "Auxiliary options". If you want to set up a rule for an entire package, see https://scalapb.github.io/docs/transformations
Szava Maczika
@maczikasz:matrix.org
[m]

Hi peeps, I am trying to use scalapb from a Jupyter notebook, but it seems I cannot get the implicits working

I am trying to just create a DataFrame from a tuple containing a ByteString, but I always get No Encoder found for repackaged.com.google.protobuf.ByteString (that's the shaded version, but I also get it without repackaged)

Is there a way to create the encoders explicitly so I can just set it like that? Or is that not really feasible?

Nadav Samet
@thesamet
Are Spark-ScalaPB's implicits available in the scope (the Jupyter cell) where they are needed?
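For instance, something along these lines in the same cell - a sketch assuming sparksql-scalapb's Implicits object, with a hypothetical generated message type MyMessage:

```scala
// These implicits need to be in scope in the cell that builds the DataFrame/Dataset,
// and they can conflict with a kernel's automatic spark.implicits._ import
import scalapb.spark.Implicits._

// MyMessage is a hypothetical ScalaPB-generated message
val ds = spark.createDataset(Seq(MyMessage(name = "example")))
```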
Szava Maczika
@maczikasz:matrix.org
[m]
I imported them right above my code
Szava Maczika
@maczikasz:matrix.org
[m]
It's running on Amazon EMR
Nadav Samet
@thesamet
I am not sure which Scala kernel is used in your setup, but many of them add an import of spark.implicits._, which messes things up. Here's an example: https://github.com/vericast/spylon-kernel/blob/2d0ddf2aca1b91738f938b72a500c20293e3156c/spylon_kernel/scala_interpreter.py#L222 - in this case it looks hard-coded and not customizable.
Szava Maczika
@maczikasz:matrix.org
[m]
Thanks, it seems you were right and our kernel has the implicits automatically included :(
Nadav Samet
@thesamet
@maczikasz:matrix.org besides passing the implicits explicitly, you can build and deploy a jar with your own pre-compiled functions that do the dataframe/dataset transformations with the right imports, and use them from Jupyter. Not as interactive as it could be, but it works.
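A small sketch of that idea - a helper compiled into a jar with the right imports baked in, then called from the notebook (all names here are hypothetical):

```scala
package com.example.helpers

import org.apache.spark.sql.{Dataset, SparkSession}
import scalapb.spark.Implicits._          // ScalaPB's Spark encoders, resolved at compile time
import com.example.protos.MyMessage       // hypothetical generated message type

object ProtoTransforms {
  // The notebook only calls this function; encoder resolution happens here,
  // away from whatever implicits the kernel injects into the cell
  def toDataset(spark: SparkSession, messages: Seq[MyMessage]): Dataset[MyMessage] =
    spark.createDataset(messages)
}
```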
Yakushev Aleksey
@AlexGruPerm
Nadav Samet
@thesamet
Does the request make it to the server? It looks like the server is using an https port - but I suspect the client isn't encrypting; maybe that's the issue? @AlexGruPerm
Yakushev Aleksey
@AlexGruPerm
I'm not sure about the question "Does the request make it to the server?", but it looks like yes, because I get a response:
HEADERS: streamId=3 headers=GrpcHttp2ResponseHeaders[:status: 400, content-length
And the client is encrypting - I use manChannelBuilder.useTransportSecurity()
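For comparison, a minimal grpc-java channel setup in Scala showing both modes (host and port are made up):

```scala
import io.grpc.ManagedChannelBuilder

// TLS: the client encrypts, so the server must actually terminate TLS on that port
val secureChannel = ManagedChannelBuilder
  .forAddress("api.example.com", 443)
  .useTransportSecurity()
  .build()

// Plaintext: only for servers that do not expect TLS; a mismatch between the two
// sides can surface as HTTP 400s or abruptly closed connections
val plaintextChannel = ManagedChannelBuilder
  .forAddress("localhost", 9000)
  .usePlaintext()
  .build()
```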
Nadav Samet
@thesamet
@AlexGruPerm I suggest doing a bit of divide and conquer to narrow down the problem and figure out where the issue is. Can you set up a mock server and see if your code can communicate with it? Can you set up a Java or Python grpc client to talk to the remote server?
Yakushev Aleksey
@AlexGruPerm
I can't create a mock server, but I use BloomRPC (to check my token and requests).
(image attachment: image.png)