These are chat archives for sbt/sbt

6th
Jun 2017
Alessandro Vermeulen
@spockz
Jun 06 08:03

@dwijnand I’ve tried the following:

import sbt.inc.Analysis
import Process._
.settings(
      compile := {
        "mvn compile" ! log
        Analysis.Empty
      }
    )

But I get

/Users/alessandrovermeulen/sources/sdk/unirepotest/build.sbt:14: error: value ! is not a member of String
        "mvn compile" ! log
                      ^
Gilad Hoch
@hochgi
Jun 06 08:35

Hi, how would I ensure atomicity of constrained tasks?
What do I mean? Here's my use case:
We have a system built from many processes (Cassandra, Elasticsearch, Kafka, etc.).
We have a multi-project build with 2 modules for integration tests.
Each of these projects needs to launch all our processes to test against.
We do this with:

testOptions in IntegrationTest ++= {
  var pids = Array.empty[String]
  Seq(Tests.Setup(() => {
      pids = launchAllProcessesAndGetPIDs()
    }),
    Tests.Cleanup(() => {
      killPIDs(pids)
    })
  )
}

Well, since many other projects have tests against parts of the system (they may only launch e.g. Cassandra, but not the other processes),
I figured we could cut our test running time if I used Tags to constrain such tasks from running concurrently.
But TestOption is not a task I can tag, so I tried the following hack:

def launchAllProcessesAndGetPIDsTask = Def.taskDyn[Array[String]] {
  Def.task[Array[String]] {
    launchAllProcessesAndGetPIDs()
  }.tag(Tags.Cassandra,Tags.Elasticsearch,Tags.Kafka,...)
}

and in Tests.Setup I call launchAllProcessesAndGetPIDsTask.value instead of invoking the method directly.
Also, I got the tests themselves tagged the same way:

test in IntegrationTest := Def.task {
  (test in IntegrationTest).value
}.tag(Tags.Cassandra,Tags.Elasticsearch,Tags.Kafka,...).value

So everything is tagged correctly, and I won't execute 2 tasks launching and testing against the same process concurrently, but: I have no "atomicity".
I mean: after Tests.Setup finishes running, any other task may run after it, even though I want only test in IntegrationTest to run immediately after it.

is there any way I can do that?

P.S. I tried sequencing using Def.sequential instead:

test in IntegrationTest := Def.sequential (
  launchAllProcessesAndGetPIDsTask,
  (test in IntegrationTest),
  killTask
).tag(Tags.Cassandra,Tags.Elasticsearch,Tags.Kafka,...).value

but apart from not being able to pass the PIDs to kill, it wouldn't work, and it seems to have been broken for quite some time now:
https://gitter.im/sbt/sbt/archives/2016/07/27
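As context for the tagging approach above: custom tags only constrain scheduling when matching concurrentRestrictions are declared. A minimal sketch (sbt 0.13 syntax; the tag names are hypothetical stand-ins for the ones used in the snippets):

```scala
// build.sbt sketch: declare custom tags and limit each globally so that
// at most one task carrying that tag runs at a time. Names are made up.
val Cassandra     = Tags.Tag("cassandra")
val Elasticsearch = Tags.Tag("elasticsearch")
val Kafka         = Tags.Tag("kafka")

concurrentRestrictions in Global ++= Seq(
  Tags.limit(Cassandra, 1),
  Tags.limit(Elasticsearch, 1),
  Tags.limit(Kafka, 1)
)
```

Note that this only limits concurrency; it does not, by itself, force a setup/test/cleanup sequence to run as one atomic group, which is the problem described above.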

Alessandro Vermeulen
@spockz
Jun 06 08:39
@dwijnand argh, it was the normal ! from sys.process. I’m trying to use the ConsoleLogger() but I don’t get any output:
.settings(
      compile := {
        import sys.process._

        val logger = ProcessLogger(x => println(x), x => println(x))

        Process(Seq("mvn","compile"), baseDirectory.value) !! logger

        Analysis.Empty
      }
    )
@dwijnand ah, !! vs !
Dale Wijnand
@dwijnand
Jun 06 09:03
Don't import sys.process. There's one (the original one) in sbt already.
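With sbt's bundled process API that looks roughly like this (a sketch, assuming an sbt 0.13 build where replacing compile with Analysis.Empty is acceptable; `!` returns the exit code, while `!!` captures stdout as a String):

```scala
// build.sbt sketch (sbt 0.13): sbt ships its own process API, so no
// sys.process import is needed inside a build definition.
compile in Compile := {
  // Run Maven in the project directory; `!` returns the exit code.
  val exit = Process(Seq("mvn", "compile"), baseDirectory.value).!
  if (exit != 0) sys.error("mvn compile failed")
  sbt.inc.Analysis.Empty
}
```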
Gilad Hoch
@hochgi
Jun 06 09:33

also tried:

test in IntegrationTest := Def.taskDyn {
  val a: Task[Array[String]] = launchAllProcessesAndGetPIDsTask.taskValue
  val b: Task[Unit] = Def.task((test in IntegrationTest).value).tag(...).taskValue
  val c: Task[Unit] = Def.task {
    val pids = launchAllProcessesAndGetPIDsTask.value
    killPIDs(pids)
  }.tag(...).taskValue
  Def.task {
    ((a doFinally b) doFinally c).value
  }.tag(...)
}.tag(...)

but still, no atomicity. Nothing will run concurrently that way, but as I said, we have 2 sub-projects for integration tests, and the order of execution was:

  • it1/launchAllProcessesAndGetPIDsTask
  • it2/launchAllProcessesAndGetPIDsTask
  • it1/it:test
  • ...

still can't ensure grouping of tasks to be run together... :confused:

Jorge
@jvican
Jun 06 09:44
@hochgi Interesting use case. I will look into it today.
Gilad Hoch
@hochgi
Jun 06 09:45
@jvican thanks! would be awesome! :)
Jorge
@jvican
Jun 06 09:46
Could you open an issue in sbt/sbt and describe your use case again?
It will be useful for future archeologists.
Gitter questions usually get lost :)
Gilad Hoch
@hochgi
Jun 06 09:47
sure thing!
Gilad Hoch
@hochgi
Jun 06 10:39
@jvican submitted:
sbt/sbt#3250
Josh
@joshlemer
Jun 06 14:45
I have an sbt project which depends on some XML files being in the resources directory, but those XML files come from a different git repo. Would sbt be the appropriate tool to copy those XMLs into src/main/resources at build time, and if so, what would be the recommended way?
Josh
@joshlemer
Jun 06 15:15
Or basically, it looks like I want to have a git dependency in unmanagedResources
rather than a file
Josh
@joshlemer
Jun 06 16:47
Yeah maybe a resourceGenerator is what I would need..
Justin Kaeser
@jastice
Jun 06 16:59
http://www.scala-sbt.org/1.0/docs/Build-Loaders.html still references the deprecated Build trait. Is there any alternative?
Alessandro Vermeulen
@spockz
Jun 06 17:34
@dwijnand thanks, that made the code a bit cleaner. What are the advantages?
Justin Kaeser
@jastice
Jun 06 17:49
@joshlemer you may want to implement it as resourceGenerators: http://www.scala-sbt.org/0.13/docs/Howto-Generating-Files.html#Generate+resources
oh missed that you already found it
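A resourceGenerators sketch along those lines (sbt 0.13 syntax; the ../endeca-stubs path is hypothetical and assumes the stub repo is already checked out next to the build):

```scala
// build.sbt sketch: copy XML stubs from a sibling checkout into
// managed resources so they land on the classpath and in the jar.
resourceGenerators in Compile += Def.task {
  val src = baseDirectory.value / ".." / "endeca-stubs"
  val dst = (resourceManaged in Compile).value / "stubs"
  // Pair each source XML with its destination under resourceManaged.
  val mapped = (src ** "*.xml").pair(Path.rebase(src, dst))
  IO.copy(mapped)
  mapped.map(_._2)
}.taskValue
```

Fetching or updating the checkout itself (git clone/pull) would still need to happen before or inside this task.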
Josh
@joshlemer
Jun 06 17:51
Thanks for the help @jastice
nafg
@nafg
Jun 06 19:04
@joshlemer you could also use git submodules or subtrees
Josh
@joshlemer
Jun 06 19:06
@nafg don't have any experience with submodules, and google results seem to be pretty negative about them; maybe I'll take a look at subtrees. Is this the typical way this would be solved, or would you typically try to use a maven repository for these resources?
nafg
@nafg
Jun 06 19:06
@joshlemer can you give a bit more background as to what sort of purpose the xml files serve and why they are in a separate repo?

FWIW https://stackoverflow.com/a/31770145/333643 says

With git submodules you typically want to separate a large repository into smaller ones. The way of referencing a submodule is maven-style - you are referencing a single commit from the other (submodule) repository. If you need a change within the submodule you have to make a commit/push within the submodule, then reference the new commit in the main repository and then commit/push the changed reference of the main repository. That way you have to have access to both repositories for the complete build.
With git subtree you integrate another repository in yours, including its history. So after integrating it, the size of your repository is probably bigger (so this is no strategy to keep repositories smaller). After the integration there is no connection to the other repository, and you don't need access to it unless you want to get an update. So this strategy is more for code and history reuse - I personally don't use it.

Josh
@joshlemer
Jun 06 19:12
A separate team uses this repo of XML files, which basically configure a big Oracle indexer called Endeca. They often change these XMLs manually as business requirements change. We call these XMLs "stubs" because they are the basic config XMLs for Endeca.
Recently we added support for adding certain features dynamically/automatically, which requires us to have a copy of the "stubs" at runtime and then fill in the dynamic sections with more XML.
However, that team still needs to be able to edit the configs manually to make static config changes etc., so that is why they have their own repo.
I guess I could push for the XMLs to be moved into our project's repo, but in principle one could imagine multiple separate projects needing up-to-date access to the XMLs, so it would be nice to have something that can handle that.
nafg
@nafg
Jun 06 19:16
@joshlemer at runtime or at build time?
Josh
@joshlemer
Jun 06 19:18
Well, at run-time we need them, but we would want to take the XMLs just once at build time to verify with testing etc. We wouldn't want to refetch the stubs more than once per build. But strictly speaking the build itself does not use them, other than to package them in the jar for use at runtime.
Justin Kaeser
@jastice
Jun 06 19:20
@joshlemer if you want to share them among projects, publishing them as a library is a reasonable approach. git dependencies can work too, as https://github.com/libling/sbt-hackling shows, but it won't handle your use-case out of the box
nafg
@nafg
Jun 06 19:20
I think they're all reasonable approaches; I don't think your use case particularly suggests one over another
You could also just get it at runtime, even setting up a hook to know when to update
Justin Kaeser
@jastice
Jun 06 19:22
if you're feeling experimental, I encourage you to try to adapt the git-resolving logic in hackling to pull in the xml from a git repo and place it in your managedResources
Josh
@joshlemer
Jun 06 19:28
Thanks for the advice guys I'll see what I can do
A little hesitant to rely on an experimental library but we'll see :smile:
Adelbert Chang
@adelbertc
Jun 06 20:29
where can I get a list of valid Patterns for specifying resolver layouts? I see some at http://www.scala-sbt.org/0.13/docs/Resolvers.html#Custom+Layout - is that all? I need one for the Scala version and SBT version
for an SBT plugin
since it's published as [name]_[scala version]_[sbt version]
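For reference, a custom-layout sketch in the style of that docs page (repo name and URL are placeholders; the optional (scala_[scalaVersion]/)(sbt_[sbtVersion]/) directories are the layout sbt itself uses for Ivy-style plugin repositories):

```scala
// build.sbt sketch: a URL resolver with an Ivy pattern that carries the
// Scala and sbt binary versions, as sbt plugin repositories do.
resolvers += Resolver.url(
  "company-sbt-plugins",
  url("https://repo.example.com/sbt-plugins/")
)(Patterns(
  "[organisation]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)" +
    "[revision]/[type]s/[artifact](-[classifier]).[ext]"
))
```

If the repository instead embeds the versions in the artifact file name rather than in directories, the pattern would need adjusting accordingly.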
Adelbert Chang
@adelbertc
Jun 06 20:36
(ripgrep ftw)
Nicolas Rinaudo
@nrinaudo
Jun 06 20:58
the documentation seems to imply that it's possible to cross-link scaladoc documentation sets: http://www.scala-sbt.org/0.13/docs/Howto-Scaladoc.html#Define+the+location+of+API+documentation+for+a+library
is this feature now gone?
I'm assuming that if it is, it's probably due to scaladoc rather than SBT, but maybe the doc needs to be updated
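The mechanism that page describes is the apiMappings / autoAPIMappings pair, which is what feeds the external-doc options to scaladoc. A sketch (sbt 0.13 syntax; the jar path and URL are hypothetical):

```scala
// build.sbt sketch: map a dependency jar to its hosted scaladoc so the
// doc task can tell scaladoc where that library's API docs live.
autoAPIMappings := true
apiMappings += (
  (unmanagedBase.value / "some-lib.jar") ->
    url("https://example.com/some-lib/api/")
)
```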
Nicolas Rinaudo
@nrinaudo
Jun 06 21:19
mmm.... looks like this might be an SBT issue after all...
SBT doesn't appear to pass the corresponding options to scaladoc anymore
Christopher Davenport
@ChristopherDavenport
Jun 06 22:50
For 1.0 did the name of the scripted-plugin artifact change?
Christopher Davenport
@ChristopherDavenport
Jun 06 23:09
Got it, thank you.
libraryDependencies ++= Seq(
  "org.scala-sbt" %% "scripted-sbt" % sbtVersion.value
)