These are chat archives for sbt/sbt-native-packager

7th Aug 2017
Leonard Daume
@ldaume
Aug 07 2017 12:17
Hi, I'm trying to build a Spark app with the Java Application Archetype and universal:packageBin as always. The problem is that it fails with [error] (*:packageOptions) Please add any Spark dependencies by supplying the sparkVersion and sparkComponents. Please remove: org.apache.spark:spark-core:2.2.0 :disappointed:. Any hint?
my sbt settings:
lazy val root = (project in file(".")).
  enablePlugins(JavaAppPackaging, DockerPlugin).
  settings(
    sparkVersion := "2.2.0",
    sparkComponents := Seq("core", "sql", "catalyst", "mllib"),

...
libraryDependencies ++= Seq(
...
      "org.apache.spark" %% "spark-core" % "2.2.0",
...
A. Alonso Dominguez
@alonsodomin
Aug 07 2017 12:21
@ldaume how are you intending to run it? Spark apps are usually packaged as Uber jars and then submitted to the cluster. You should probably look at sbt-assembly.
or this very tiny plugin that I wrote which uses sbt-assembly under the hood… https://github.com/alonsodomin/sbt-spark
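For context, a minimal sbt-assembly setup along those lines might look roughly like the following (a sketch only; the project name, Scala version and the plugin version are assumptions, Spark 2.2.0 is taken from the snippet above):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt
lazy val sparkApp = (project in file(".")).
  settings(
    name := "my-spark-app",
    scalaVersion := "2.11.11",
    // Spark is provided by the cluster at runtime, so keep it out of the uber jar
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided"
    )
  )

Running sbt assembly then produces a single jar under target/ that can be handed to spark-submit.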
A. Alonso Dominguez
@alonsodomin
Aug 07 2017 12:37
oh, I see, you are using sbt-spark-packages in there...
still can't understand what you're trying to achieve with your setup
but the error message you're getting comes from having spark-core in your libraryDependencies
the message comes from sbt-spark-packages, not sbt-native-packager
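In other words, the usual fix is to drop the explicit spark-* entries and let the plugin derive the Spark dependencies from sparkVersion and sparkComponents, which is what the error message is asking for. Roughly (a sketch based on the snippet above, showing only the relevant settings):

lazy val root = (project in file(".")).
  enablePlugins(JavaAppPackaging, DockerPlugin).
  settings(
    // the plugin resolves spark-core, spark-sql, spark-catalyst and spark-mllib
    // from these two settings, so they must not appear again in libraryDependencies
    sparkVersion := "2.2.0",
    sparkComponents := Seq("core", "sql", "catalyst", "mllib"),
    libraryDependencies ++= Seq(
      // non-Spark dependencies only
    )
  )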
Leonard Daume
@ldaume
Aug 07 2017 13:02
Thanks @alonsodomin
Nepomuk Seiler
@muuki88
Aug 07 2017 17:37
@alonsodomin is right. At my company we are using native-packager for our services, but sbt-assembly for Spark, as it expects uber jars
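A build that mixes the two along those lines might be laid out roughly like this (a sketch; project names, the main class and the dependency list are made up):

// build.sbt — services packaged with native-packager, the Spark job with sbt-assembly
lazy val service = (project in file("service")).
  enablePlugins(JavaAppPackaging, DockerPlugin).
  settings(
    libraryDependencies ++= Seq(/* service dependencies */)
  )

lazy val sparkJob = (project in file("spark-job")).
  settings(
    // built into an uber jar via sbt-assembly and handed to spark-submit
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
    ),
    mainClass in assembly := Some("com.example.SparkMain")
  )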