Unresolved dependency path for SBT project in IntelliJ

I am using IntelliJ to develop a Spark application, and I am following this tutorial on how to make IntelliJ work nicely with an SBT project.

Since my whole team uses IntelliJ, we can just change build.sbt, but we get this unresolved dependency error:

Error: Error while importing SBT project:

[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::
[warn] ::      UNRESOLVED DEPENDENCIES      ::
[warn] ::::::::::::::::::::::::::::::::::::::::
[warn] :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]     sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn]       +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/*:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/*:update' for the full output.
[error] (mainRunner/*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/*:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM

And this is my build.sbt:

name := "sparrow-to-orc"

version := "0.1"

scalaVersion := "2.11.8"

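// Spark dependencies declared once so they can be given different scopes below
// (provided for the assembly jar, compile for running inside IntelliJ)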
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-hive" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0"
)

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

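// Mark the Spark dependencies as "provided" so they are not bundled into the assembly jar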
libraryDependencies ++= sparkDependencies.map(_ % "provided")

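// Separate subproject that pulls the Spark dependencies back in at compile scope
// so the application can be run from inside IntelliJ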
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

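// Resolve duplicate files when sbt-assembly builds the fat jar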
assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

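// Run with the full compile classpath so the "provided" Spark dependencies are on the run classpath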
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))


If I don't have this block, everything works fine:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)


But then I won't be able to run the application inside IntelliJ, since the Spark dependencies won't be included in the classpath.
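For what it's worth, the resolver is looking for sparrow-to-orc_2.10 even though the root project sets scalaVersion := "2.11.8", so I wonder whether mainRunner is falling back to a default Scala version. Below is an untested variant of the mainRunner definition with the Scala version pinned to match the root project; the added scalaVersion setting is my own guess, not something from the tutorial.

lazy val mainRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    // guess: align the subproject's Scala version with the root project so the
    // inter-project dependency resolves as sparrow-to-orc_2.11 rather than _2.10
    scalaVersion := "2.11.8",
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )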
