How do I load hiveContext into Zeppelin?

I am new to the Zeppelin notebook, but one thing I noticed is that, unlike in the spark shell, a HiveContext is not automatically created in Zeppelin when I start the notebook.

When I try to manually create the HiveContext in Zeppelin like this:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

I am getting this error:

java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
    at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
    at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:226)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:229)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)

I think the error means that the existing metastore_db does not allow the new session to take it over.

I am using Spark 1.6.1.

Any help would be appreciated.



3 answers


Check the permissions on your metastore_db directory. Then verify that a HiveContext can be created in REPL (spark-shell) mode; once that works, move on to Zeppelin.
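The "Unable to instantiate SessionHiveMetaStoreClient" error typically comes from the embedded Derby metastore: Derby allows only one process at a time, so a stale lock file or missing write permission blocks the new HiveContext. A rough sketch of the cleanup (the METASTORE_DIR path is an assumption; point it at the metastore_db directory in the folder you launch Zeppelin from):

```shell
#!/bin/sh
# Hypothetical path -- adjust to wherever your metastore_db lives.
METASTORE_DIR="${METASTORE_DIR:-./metastore_db}"

clean_metastore() {
  if [ -d "$METASTORE_DIR" ]; then
    # Derby allows a single connected process; stale *.lck files
    # left by a crashed spark-shell block new sessions.
    rm -f "$METASTORE_DIR"/*.lck
    # Make sure the user running Zeppelin can write to the directory.
    chmod -R u+rwX "$METASTORE_DIR"
    echo "cleaned $METASTORE_DIR"
  else
    echo "no metastore_db at $METASTORE_DIR"
  fi
}

clean_metastore
```

Make sure no other spark-shell or Hive process is still using the metastore before removing lock files.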





Could you please try connecting to Hive from the shell? I just want you to check whether Hive is installed correctly, because I ran into a similar problem a while ago. Also try connecting to Hive from the Scala shell. If that works, then the problem must be caused by Zeppelin.
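For example, a quick check of Hive from the shell might look like this (assuming the `hive` CLI is installed and on your PATH; adjust if your distribution installs it elsewhere):

```shell
#!/bin/sh
# Sanity-check Hive outside of Zeppelin and Spark.
check_hive() {
  if command -v hive >/dev/null 2>&1; then
    # Should list at least the `default` database if Hive is healthy.
    hive -e "SHOW DATABASES;"
  else
    echo "hive CLI not found on PATH"
  fi
}

check_hive
```

If this fails too, the problem is in the Hive installation itself, not in Zeppelin.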





Try creating the HiveContext like this in PySpark:

from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

conf = SparkConf()
sc = SparkContext(conf=conf)
sc._jvm.org.apache.hadoop.hive.conf.HiveConf()  # force-load Hive's configuration class
hiveContext = HiveContext(sc)

Hope it helps.

Regards,

Neeraj
