I am using IntelliJ IDEA 2016.3.

import sbt.Keys._
import sbt._

object ApplicationBuild extends Build {

  object Versions {
    val spark = "1.6.3"
  }

  val projectName = "example-spark"

  val common = Seq(
    version := "1.0",
    scalaVersion := "2.11.7"
  )

  val customLibraryDependencies = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-hive" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming" % Versions.spark % "provided",

    "org.apache.spark" %% "spark-streaming-kafka" % Versions.spark
      exclude("log4j", "log4j")
      exclude("org.spark-project.spark", "unused"),

    "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",

    "org.slf4j" % "slf4j-api" % "1.7.10",

    "org.slf4j" % "slf4j-log4j12" % "1.7.10"
      exclude("log4j", "log4j"),

    "log4j" % "log4j" % "1.2.17" % "provided",

    "org.scalatest" %% "scalatest" % "2.2.4" % "test"
  )

  // Root project wiring (assumed; the original snippet ends at the Seq above):
  // attach the common settings and the dependency list to the project.
  lazy val main = Project(projectName, file("."))
    .settings(common: _*)
    .settings(libraryDependencies ++= customLibraryDependencies)
}

I have been getting the runtime exception below, even though I declared all the dependencies correctly as shown above (they also appear as resolved under the project's Libraries, per the screenshot).

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
    at example.SparkSqlExample.main(SparkSqlExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SQLContext
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 6 more

I investigated this further on the web and found that it is usually caused by inappropriate entries in build.sbt or by version mismatches. But in my case everything looks correct, as shown above. Please suggest where I went wrong here.

  • Shouldn't you be using spark-sql_2.11 and such? Commented Jun 20, 2017 at 7:07
  • @philantrovert Since we are using %% when declaring dependencies, sbt is smart enough to append the Scala binary version with an underscore. Because we set scalaVersion := "2.11.7" above, sbt takes 2.11 and finally resolves the dependency as spark-sql_2.11 (see the equivalence sketched just below). Commented Jun 20, 2017 at 7:19
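
To make that concrete, here is the equivalence, using the coordinates from the question (nothing new is added to the build; this is just the expanded form):

// With scalaVersion := "2.11.7", these two declarations resolve identically:
"org.apache.spark" %% "spark-sql" % "1.6.3"
"org.apache.spark" %  "spark-sql_2.11" % "1.6.3"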

1 Answer

I guess this is because you marked your dependencies as "provided": that scope means they are expected to be supplied by the environment at run time, but apparently you (or IDEA) don't provide them. If you need to keep "provided" (so that spark-submit supplies Spark on a real cluster) and you launch via sbt run, one common workaround is sketched below.
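
This is a sketch of that workaround, not part of the question's build: it points the run task at the compile classpath, which, unlike the runtime classpath, does include "provided" dependencies.

// build.sbt addition: put "provided" dependencies back on the classpath
// of `sbt run` only; packaged artifacts are unaffected.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated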

Otherwise, try removing the "provided" qualifier, or (my preferred way) move the class with the main method to src/test/scala; the test classpath also includes "provided" dependencies, as the sketch below shows.
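
For illustration, here is a minimal sketch of that second option. The object name is hypothetical (the question's class is example.SparkSqlExample), and a local[*] master is assumed for in-IDE runs:

// src/test/scala/example/SparkSqlExampleRunner.scala
package example

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkSqlExampleRunner {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("example-spark")
      .setMaster("local[*]") // run Spark in-process, suitable for the IDE
    val sc = new SparkContext(conf)

    // The class that previously failed with NoClassDefFoundError now loads,
    // because spark-sql is on the test classpath.
    val sqlContext = new SQLContext(sc)
    val df = sqlContext.range(0, 10) // tiny smoke test: a DataFrame of longs
    df.show()

    sc.stop()
  }
}

Running this object from IDEA (or via sbt "test:runMain example.SparkSqlExampleRunner") should no longer throw the NoClassDefFoundError.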
