These are chat archives for sbt/sbt

24th
Aug 2017
nafg
@nafg
Aug 24 04:53
What's the general procedure for publishing an sbt plugin for 1.0?
I wanted to take a stab at scalacenter/scalajs-bundler#150 but don't know what I'm doing
rcriten
@rcriten
Aug 24 11:41
I have a multi-project sbt setup where the last stage is to generate a bill of materials. I've created a separate sub-module to generate the bill of materials and it works as expected when run AFTER all the other modules have finished publishing. However, if I run this from a clean project the bill of materials is generated incorrectly. So, I need to ensure that the bill-of-materials sub-project runs after every other module has finished publishing. What is the best way to accomplish this? I've tried using the dependsOn(...) setting, but it overwrites the library dependencies such that the moduleID is incorrect (doesn't match the maven publish setting).
Justin Kaeser
@jastice
Aug 24 12:06
has anyone ever... run into ARG_MAX limitations for forked sbt processes, and found a workaround?
@rcriten how did you try it? it should look something like (generateBom in bomProject).dependsOn(publish in otherProject)
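A minimal sketch of what that wiring could look like in build.sbt, assuming a hypothetical generateBom task key and subprojects named bom, utils, and common (the real names may differ):

      // hypothetical task key; the real build defines its own
      lazy val generateBom = taskKey[Unit]("Generate the bill of materials")

      lazy val bom = (project in file("bom"))
        .settings(
          generateBom := { /* read the published metadata and write the BOM here */ },
          // re-declare the task so it always runs after the sibling projects have published
          generateBom := generateBom.dependsOn(publish in utils, publish in common).value
        )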
rcriten
@rcriten
Aug 24 12:10
That is the one syntax I did not try. I'll work with that and see how it goes. Thanks.
rcriten
@rcriten
Aug 24 12:22
I tried lazy val bom = (project in file("bom")).dependsOn(utils % "publish", common % "publish"). This yields the error "Cannot add dependency to configuration publish because this configuration does not exist." I then tried lazy val bom = (.....).dependsOn(publish in utils), which yields a compilation error.
Justin Kaeser
@jastice
Aug 24 13:54
you can also try including a reference to publish.value from your bom task. also you might not actually want publish but instead packageBin which gives you the file. you might want to open a SO question with some more details
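A sketch of the packageBin variant, using the same hypothetical generateBom key as above; (packageBin in (utils, Compile)).value yields the packaged jar file, which the BOM task can then inspect:

      generateBom := {
        val utilsJar  = (packageBin in (utils, Compile)).value
        val commonJar = (packageBin in (common, Compile)).value
        // read metadata from the jars and write the bill of materials
      }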
eugene yokota
@eed3si9n
Aug 24 13:58
@nafg the general procedure has not changed since 0.13. if you want to contribute, the first thing you can try is to use 0.13.16 to cross-build the plugin against sbt 1.0.0, and send a PR - http://www.scala-sbt.org/0.13/docs/sbt-0.13-Tech-Previews.html#sbt-cross-building
you can check out the commit history for plugins listed in https://github.com/sbt/sbt/wiki/sbt-1.x-plugin-migration
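For reference, a sketch of the cross-building setup described on that page, assuming the plugin itself is built with sbt 0.13.16:

      // in the plugin's build.sbt
      crossSbtVersions := Seq("0.13.16", "1.0.0")

      // then from the sbt shell:
      //   ^compile        compiles against each sbt version in crossSbtVersions
      //   ^publishLocal   publishes the plugin for each sbt version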
Vasiliy Levykin
@vvlevykin
Aug 24 14:00

Hello, it seems that after upgrading to 1.0 this line stops working:

watchSources += baseDirectory.value / "input"

It's from http://www.scala-sbt.org/1.x/docs/Basic-Def-Examples.html
I'm getting:

error: No implicit for Append.Value[Seq[sbt.internal.io.Source], java.io.File] found, so java.io.File cannot be appended to Seq[sbt.internal.io.Source]

Did this change?

eugene yokota
@eed3si9n
Aug 24 14:00
@vvlevykin yes. there will be a patch release to fix that soon
Vasiliy Levykin
@vvlevykin
Aug 24 14:01
I see, thanks
eugene yokota
@eed3si9n
Aug 24 14:03
in the meantime the workaround is: watchSources += new sbt.internal.io.Source(baseDirectory.value / "input", AllPassFilter, NothingFilter) according to sbt/sbt#3438
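That workaround in context, as a sketch using the input directory from the earlier snippet:

      // temporary workaround for sbt 1.0.0 (see sbt/sbt#3438)
      watchSources += new sbt.internal.io.Source(
        baseDirectory.value / "input", // directory to watch
        AllPassFilter,                 // include every file under it
        NothingFilter                  // exclude nothing
      )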
Rob Norris
@tpolecat
Aug 24 16:06
ok, so i have a scala-js project. in the blahJVM project test:compile::resourceDirectories says that .../.jvm/src/test/resources is where resources should live, yet when I test:compile they don't end up in target/scala-2.12/test-classes
what am i misunderstanding?
OlegYch
@OlegYch
Aug 24 16:26
perhaps compile is too specific
i.e. the task axis
Rob Norris
@tpolecat
Aug 24 16:27
the normal src/main/resources directory doesn't work either
OlegYch
@OlegYch
Aug 24 16:27
certainly wouldn't make sense to specify resources setting for compilation
try 'test:resourceDirectories'
Rob Norris
@tpolecat
Aug 24 16:28
same
Derek Wickern
@dwickern
Aug 24 16:28
is there an easy way to replace a third-party dependency with one under a different organization/group id, without going through and excluding the dependency everywhere?
alternatively, someone I can hire to break some knee caps and get my PRs merged
nevermind, I forgot excludeDependencies was a thing
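A sketch of that combination, swapping one organization for another; the group and artifact ids here are only illustrative:

      // drop the transitive artifact published under the old organization...
      excludeDependencies += ExclusionRule("com.oldorg", "some-lib")
      // ...and pull in the same library under the new organization instead
      libraryDependencies += "com.neworg" %% "some-lib" % "1.2.3"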
Glen Marchesani
@fizzy33
Aug 24 17:36
I have a strange compile error converting code that works in project/Common.scala into a plugin... The code compiles fine in project/Common.scala but not from the plugin project. Are there any implicits that are included when using project/Common.scala?
Glen Marchesani
@fizzy33
Aug 24 17:41

well I take that back... I have something like this

      (resourceGenerators in Compile) += {
        val dir = (resourceManaged in Compile).value
        val file: File = dir / "version"
        IO.write(file, "foo")
        Seq(file)
      }

and I'm getting this compile error both from the plugin project and from project/Common.scala

[error] /Users/glen/code/model3/project/Common.scala:50: No implicit for Append.Value[Seq[sbt.Task[Seq[java.io.File]]], Seq[sbt.File]] found,
[error]   so Seq[sbt.File] cannot be appended to Seq[sbt.Task[Seq[java.io.File]]]
[error]       (resourceGenerators in Compile) += {
[error]                                       ^
[error] one error found
any ideas would be appreciated
the following works from project/Common.scala but not when I have the same code in a plugin project
      (resourceGenerators in Compile) += Def.task[Seq[File]] {
        val dir = (resourceManaged in Compile).value
        val file: File = dir / "version"
        IO.write(file, "foo")
        Seq(file)
      }
Which brings me back to the original question about whether there are imports that are implicitly included when compiling from project/Common.scala
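For comparison, the idiom documented for both sbt 0.13 and 1.0 appends a task value rather than a plain Seq[File]; a sketch (inside an AutoPlugin this would live in projectSettings, with import sbt._, Keys._ in scope):

      resourceGenerators in Compile += Def.task {
        val dir = (resourceManaged in Compile).value
        val file: File = dir / "version"
        IO.write(file, "foo")
        Seq(file)
      }.taskValue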
OlegYch
@OlegYch
Aug 24 17:56
++=
actually nvm
there are no implicits included except those you include yourself
Rob Norris
@tpolecat
Aug 24 17:59
(We resolved the resource thing by switching from CrossType.Pure to CrossType.Full.)
Suhas Gaddam
@suhasgaddam
Aug 24 18:04
OlegYch
@OlegYch
Aug 24 18:05
1.0 only supports 2.12
Suhas Gaddam
@suhasgaddam
Aug 24 18:07
:thumbsup: that makes sense then, Thanks.
Jordan Coll
@jordancoll
Aug 24 19:15
hey guys, been trying to understand how I can change the output dir for the default packageBin task.
I've got a multiproject build.sbt with the root project aggregating the subprojects.
packageBin depends on artifactPath, so I wanted to change that.
problem is, artifactPath should contain the artifactName of each subproject, which would be different for each project.
Is there a way to make artifactPath reference artifactName in whichever project it is currently running?
OlegYch
@OlegYch
Aug 24 19:17
it does that by default
Jordan Coll
@jordancoll
Aug 24 19:41
so
artifactPath in ThisBuild := file("output") / artifactName.value(scalaVersion.value, moduleID.value, artifact.value)
OlegYch
@OlegYch
Aug 24 19:45
i guess?
i think ThisBuild is evil, just specify it in each project
Jordan Coll
@jordancoll
Aug 24 19:47
well, ideally I'd like to specify it only once rather than c&p
OlegYch
@OlegYch
Aug 24 19:47
don't c&p, use functions
Jordan Coll
@jordancoll
Aug 24 19:48
but also in ThisBuild, show artifactPath says they're all called root.jar =(
OlegYch
@OlegYch
Aug 24 19:48
e.g. val commonSettings = Seq(a := b)
project.settings(commonSettings)
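A sketch of that approach applied to the artifactPath question above, so each project picks up its own artifact name; the output directory and project names are illustrative:

      lazy val commonSettings = Seq(
        artifactPath in (Compile, packageBin) :=
          file("output") / artifactName.value(
            ScalaVersion(scalaVersion.value, scalaBinaryVersion.value),
            projectID.value,
            (artifact in (Compile, packageBin)).value
          )
      )

      lazy val core = project.settings(commonSettings)
      lazy val web  = project.settings(commonSettings)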
Jordan Coll
@jordancoll
Aug 24 19:49
yarrrr