Alex Henning Johannessen
@ahjohannessen
@thesamet would it require much effort to allow collection_type as a global plugin setting, in addition to the package or file option? I really want List instead of Seq globally for all repeated fields.
Nadav Samet
@thesamet
@ahjohannessen Effort is low, the problem is not the effort though, but the kind of issues that enabling this brings. See https://scalapb.github.io/faq.html#why-a-certain-customization-is-not-available-as-a-global-generator-parameter
Is there a concern about having a top-level package option that would apply globally?
Alex Henning Johannessen
@ahjohannessen
@thesamet mostly just the repeated boilerplate when we have hundreds of proto files. I get the reason why it might be a bad idea in general.
If I define a package.proto with package a.b; and set collection_type: "List", would it apply to other files with package a.b.c;? I guess not; I have not seen the code.
Alex Henning Johannessen
@ahjohannessen

Ah, just noticed this in the docs:

All the options in this file will be applied to all proto files in the package com.mypackage and its sub-packages

Nadav Samet
@thesamet
@ahjohannessen yes, it cascades to sub-packages. So one package option and you are done for the entire project. I hope this works for you!
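For illustration, a sketch of such a package-scoped options file (the file name and package are made up; scope: PACKAGE and collection_type are options defined in ScalaPB's scalapb/scalapb.proto):

// a/b/package.proto -- applies to all proto files in package a.b and its sub-packages
syntax = "proto2";

package a.b;

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  scope: PACKAGE
  collection_type: "List"  // repeated fields generate List instead of Seq
};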
Alex Henning Johannessen
@ahjohannessen
@thesamet I think that helps, I’ll try it out :)
Ukonn Ra
@UkonnRa
Hi, I am new to ScalaPB and sbt-assembly, and I have a problem that lots of people may have encountered before... When using sbt run or even sbt pack, ScalaPB works fine, but when using sbt assembly, this error shows up:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional server found. Try adding a dependency on the grpc-netty or grpc-netty-shaded artifact
Anyone facing this error?
Here is my assemblySettings:
lazy val assemblySettings = Seq(
  test in assembly := {},
  mainClass in assembly := Some("io.grpc.examples.helloworld.HelloWorldServer"),
  assemblyOption in assembly := (assemblyOption in assembly).value
    .copy(includeScala = true, includeDependency = true),
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case "application.conf"            => MergeStrategy.concat
    case _                             => MergeStrategy.first
  },
)
And dependencies:
      "io.grpc"              % "grpc-netty-shaded"            % scalapb.compiler.Version.grpcJavaVersion,
      // "io.grpc"              % "grpc-netty"            % scalapb.compiler.Version.grpcJavaVersion,
      "com.thesamet.scalapb" %% "scalapb-runtime"      % scalapb.compiler.Version.scalapbVersion % "protobuf",
      "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion
I tried both grpc-netty-shaded and grpc-netty, but neither helps.
Nadav Samet
@thesamet
@UkonnRa I don't think I have seen this before. If you can prepare a repo with a minimal example I can try, I'd be happy to help.
Ukonn Ra
@UkonnRa
@thesamet Sorry for the late reply. I created a mini project to show my problem: https://github.com/UkonnRa/scalapb-test
I guess the problem is that I set the wrong assemblyMergeStrategy,
but I'm not familiar with sbt-assembly at all, so I cannot figure out where the problem is.
Sad...
Nadav Samet
@thesamet

@UkonnRa For your test project, this works for me:

assemblyMergeStrategy in assembly := {
    case x if x.contains("io.netty.versions.properties") => MergeStrategy.discard
    case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
}

I also sent you a PR. This SO answer was useful: https://stackoverflow.com/a/54634225/97524

Ukonn Ra
@UkonnRa
@thesamet It works! Thx!!!
But why? Why should I set the settings like this? Why should I discard io.netty.versions.properties?
Nadav Samet
@thesamet
@UkonnRa TBH, this is a good question to ask the sbt-assembly / netty developers. I'd like to know the answer too, but it's unrelated to ScalaPB.
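For readers hitting the same error, a plausible explanation (not confirmed in this thread): grpc-java discovers its netty server and channel providers through ServiceLoader files under META-INF/services, so a blanket MergeStrategy.discard on META-INF, as in the original settings, removes them, while the duplicated io.netty.versions.properties files merely clash between the netty jars and are safe to drop. A sketch of a merge strategy along those lines, in the same sbt-assembly 0.14.x syntax used above:

assemblyMergeStrategy in assembly := {
  // keep (and de-duplicate) the ServiceLoader provider files that grpc-java needs
  case PathList("META-INF", "services", _*) => MergeStrategy.filterDistinctLines
  // netty ships this metadata file in several jars; the copies conflict
  case x if x.endsWith("io.netty.versions.properties") => MergeStrategy.discard
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case "application.conf"                   => MergeStrategy.concat
  case _                                    => MergeStrategy.first
}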
Ukonn Ra
@UkonnRa
Sure, THX a lot!
Alex Henning Johannessen
@ahjohannessen
@thesamet Have you published sbt-protoc for 2.13.0?
Louise Oram
@lcoram
https://github.com/lcoram/multimoduleapi --- I have created a basic example where I want the proto files compiled into a common module. However, I currently cannot get it to compile if I add .dependsOn(proto) ...
Alex Henning Johannessen
@ahjohannessen
@thesamet nevermind me, I suppose sbt 1.x is only for 2.12.x :)
Nadav Samet
@thesamet

@lcoram It's an SBT gotcha. In sbt, try:

[testcomp] $ module1/scalaVersion
[info] 2.13.0
[testcomp] $ proto/scalaVersion
[info] 2.12.7

See that each has a different scala version? You need to set the scala version for all sub-projects, by changing line 6 in build.sbt to

scalaVersion in ThisBuild := "2.13.0"
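A minimal sketch of the shape such a build.sbt might then take (module and directory names are illustrative, not taken from the linked repo):

// build.sbt
scalaVersion in ThisBuild := "2.13.0"  // one Scala version for every sub-project

lazy val proto = (project in file("proto"))       // .proto files compiled here

lazy val module1 = (project in file("module1"))
  .dependsOn(proto)                               // uses the generated classes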
Louise Oram
@lcoram
Oh dear, I had read about that but thought I was dealing with it by setting the Scala version in each sub-module's build.sbt.
Louise Oram
@lcoram
Ok, so now it compiles... but where are the .scala files generated from the protos? The idea was to wrap them in the common module, which then gets depended on by the other modules (but this is a skeleton example, so there isn't much there).
(In the larger version, this is where running sbt compile first and then sbt run seems to work...)
Nadav Samet
@thesamet
In SBT you can type proto/protocSources to see where it picks up protos to compile, and check the generator target settings to see where it would generate to. If you look at those directories, you'll see that they are probably not what you meant them to be.
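For reference, a sketch of the sbt-protoc settings behind those two locations, using the plugin's standard keys and default paths (to go inside the proto module's settings):

// where protoc reads .proto files from (default: src/main/protobuf)
PB.protoSources in Compile := Seq((sourceDirectory in Compile).value / "protobuf"),

// where ScalaPB writes the generated .scala files; since this is under
// sourceManaged, other modules see them through dependsOn(proto)
PB.targets in Compile := Seq(
  scalapb.gen() -> (sourceManaged in Compile).value
)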
Matheus Hoffmann
@Hoffmannxd
Hello folks, I would like to know what you think about allowing the user to define the default instance. In some cases the default value of primitive types (string, int, bool) may not be desired. In a real example, I'm using a type mapper java.util.UUID <-> string, and UUID.fromString can throw, so the default instance leads to exceptions since the default value of string is "". I used a Try inside the type mapper, but I would like to know whether it would be possible to define, in this case, the default value of the string as the nil UUID ("00000000-0000-0000-0000-000000000000").
Nadav Samet
@thesamet
@Hoffmannxd The type mapper function is expected to be total (returning a value for all inputs, no exceptions). Using a Try and returning a default UUID is fair. You can also consider mapping to Option[UUID] and returning None on error.
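A minimal sketch of the Option[UUID] variant, assuming the string field is mapped via ScalaPB's custom_type / TypeMapper mechanism:

import java.util.UUID
import scala.util.Try
import scalapb.TypeMapper

// total in both directions: strings that are not valid UUIDs
// (including the proto default "") become None instead of throwing
implicit val uuidMapper: TypeMapper[String, Option[UUID]] =
  TypeMapper[String, Option[UUID]](s => Try(UUID.fromString(s)).toOption)(
    _.fold("")(_.toString)
  )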
Matheus Hoffmann
@Hoffmannxd
@thesamet Thanks!
Louise Oram
@lcoram
@thesamet Thanks for the help, think I figured things out!
Taeguk Kwon
@taeguk
Excuse me, I have one issue.
To use any file-level option like no_default_values_in_constructor, I must import scalapb/scalapb.proto. But in my case, the same proto files are shared among several projects written in various languages. So if I import scalapb/scalapb.proto in the proto files, the projects not written in Scala fail to build.
Does anyone know a way to solve this?
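For context, a sketch of the kind of file-level option being described (the file and message names are made up; the option itself is ScalaPB's):

// example.proto -- the import of scalapb/scalapb.proto is what breaks the
// build for the non-Scala projects sharing this file
syntax = "proto3";

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  no_default_values_in_constructor: true
};

message Example {
  string id = 1;
}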
Nadav Samet
@thesamet
@taeguk you can generate code for scalapb.proto for these other languages
alternatively, if you're using it as a package-level option, you can try to exclude that proto file for the other languages.
Taeguk Kwon
@taeguk
@thesamet I had considered the way you suggested (a package-level option). But currently a package-level option requires the package keyword in package.proto,
and many of my proto files don't use the package keyword, so I can't utilize package-level options.
Nadav Samet
@thesamet
Got it, the first option should work then (generating code for scalapb.proto for the other languages)
Taeguk Kwon
@taeguk
@thesamet Right. By the way, would it be bad to make no_default_values_in_constructor available not only as a file-level option but also as a generator parameter?
That option is about leveraging the type system rather than applying something to specific proto files. So, IMHO, if someone wants to use the option, they probably always want to apply it to all proto files through a generator parameter.
Taeguk Kwon
@taeguk
Okay I understand
bifunctor
@bifunctor
Does ScalaPB support bidirectional streaming?