Compilation errors in Spark (DataTypeConversions.scala) in IntelliJ when using Maven

Since roughly 7/30/14, I have not been able to compile the Spark master branch (HEAD) in IntelliJ. Has anyone encountered this, or found a workaround?

Error:scalac: 
     while compiling: /d/funcs/sql/core/src/main/scala/org/apache/spark/sql/types/util/DataTypeConversions.scala
        during phase: jvm
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args: -classpath :/shared/jdk1.7.0_25/jre/classes:/home/steve/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
  last tree to typer: Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
              symbol: null
   symbol definition: null
                 tpe: Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
       symbol owners: 
      context owners: anonymous class anonfun$asScalaDataType$1 -> package util
== Enclosing template or block ==
Template( // val <local $anonfun>: <notype>, tree.tpe=org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
  "scala.runtime.AbstractFunction1", "scala.Serializable" // parents
  ValDef(
    private
    "_"
    <tpt>
    <empty>
  )
  // 3 statements
  DefDef( // final def apply(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField
    <method> final <triedcooking>
    "apply"
    []
    // 1 parameter list
    ValDef( // javaStructField: org.apache.spark.sql.api.java.StructField
      <param> <synthetic> <triedcooking>
      "javaStructField"
      <tpt> // tree.tpe=org.apache.spark.sql.api.java.StructField
      <empty>
    )
    <tpt> // tree.tpe=org.apache.spark.sql.catalyst.types.StructField
    Apply( // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=org.apache.spark.sql.catalyst.types.StructField
      DataTypeConversions.this."asScalaStructField" // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=(javaStructField: org.apache.spark.sql.api.java.StructField)org.apache.spark.sql.catalyst.types.StructField
      "javaStructField" // javaStructField: org.apache.spark.sql.api.java.StructField, tree.tpe=org.apache.spark.sql.api.java.StructField
    )
  )
  DefDef( // final def apply(v1: Object): Object
    <method> final <bridge>
    "apply"
    []
    <snip>
        DataTypeConversions$$anonfun$asScalaDataType$1.super."<init>" // def <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1, tree.tpe=()scala.runtime.AbstractFunction1
        Nil
      )
      ()
    )
  )
)
== Expanded type of tree ==
ConstantType(
  value = Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
)
uncaught exception during compilation: java.lang.AssertionError


2 answers


I resorted to recursively removing all traces of IntelliJ project metadata:

find . -name \*.iml | xargs rm -f

and then re-importing the project from scratch from the pom.xml in the root/parent directory. Everything worked again.

It looks like the IntelliJ .iml files can end up holding stale or corrupted state.
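
For completeness, a fuller cleanup might also remove the .idea project directory, which IntelliJ maintains alongside the .iml files. This is a sketch, not part of the original workaround; back up any run configurations in .idea you care about first:

# remove IntelliJ module files and the project directory
# (the -delete flag requires GNU or BSD find)
find . -name '*.iml' -delete
rm -rf .idea

After that, re-import the project in IntelliJ from the root pom.xml.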



According to this thread on the Spark user list: http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-td10532.html

try running:

sbt clean
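
If a clean alone doesn't fix it, a full clean rebuild is a reasonable next step. As a sketch, assuming the Spark source tree of that era, which shipped an sbt/sbt launcher script in the repo root:

# run from the Spark source root
sbt/sbt clean
sbt/sbt assembly   # rebuilds the assembly jar from scratch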
