Exception after setting property "spark.sql.hive.metastore.jars" in "spark-defaults.conf"

Below are the versions of Spark and Hive installed on my system:

Spark: spark-1.4.0-bin-hadoop2.6

Hive: apache-hive-1.0.0-bin

I have set up a Hive installation to use MySQL as the metastore. The goal is to access the MySQL metastore and execute HiveQL queries from inside spark-shell (using HiveContext).

For now, I can execute HiveQL queries by accessing the Derby metastore (as described here; I believe Spark 1.4 comes bundled with Hive 0.13.1, which in turn uses an internal Derby database as the metastore).
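For reference, this is the kind of HiveQL round trip that works for me in spark-shell against the bundled Derby metastore (the table name below is just an illustration, not a table from my setup):

    // paste into spark-shell; sc is the SparkContext the shell creates
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
    // any HiveQL should work here; "src" is only an example name
    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    hiveContext.sql("SHOW TABLES").collect().foreach(println)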

Then I tried to point spark-shell to my external metastore (in this case MySQL) by adding the property below (as suggested here) to $SPARK_HOME/conf/spark-defaults.conf:

spark.sql.hive.metastore.jars   /home/mountain/hv/lib:/home/mountain/hp/lib
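In case it matters: the Spark documentation describes this value as a classpath in the standard format for the JVM, so if bare directories are not expanded to the jars inside them, the wildcard form below might be what is expected (the /* suffixes are a guess on my part, not something I have confirmed):

    spark.sql.hive.metastore.jars   /home/mountain/hv/lib/*:/home/mountain/hp/lib/*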


I also copied $HIVE_HOME/conf/hive-site.xml to $SPARK_HOME/conf.
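For completeness, the metastore-related part of my hive-site.xml looks roughly like this (the host, database name, and credentials below are placeholders, not the real values):

    <!-- placeholders: substitute the real host, database, user, and password -->
    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hiveuser</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hivepass</value>
    </property>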

But I am getting the following exception when I run spark-shell:

    mountain@mountain:~/del$ spark-shell 
    Spark context available as sc.
    java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError:
    org/apache/hadoop/hive/ql/session/SessionState when creating Hive client
    using classpath: file:/home/mountain/hv/lib/, file:/home/mountain/hp/lib/
    Please make sure that jars for your version of hive and hadoop are
    included in the paths passed to spark.sql.hive.metastore.jars.
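Note that SessionState lives in the hive-exec jar, so a quick check like the one below (assuming hv is my Hive lib directory and hp is my Hadoop lib directory) should show whether the jars the error asks for are actually present:

    # SessionState is packaged in hive-exec; confirm it is really in the directory
    ls /home/mountain/hv/lib/hive-exec-*.jar
    # and that the Hadoop side actually contains jars, not just subdirectories
    ls /home/mountain/hp/lib/*.jar | head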


Am I missing something, or did I set the spark.sql.hive.metastore.jars property incorrectly?
