Failed to create Hbase table using Hive query via Spark

Using the following tutorial: https://hadooptutorial.info/hbase-integration-with-hive/ I was able to integrate HBase with Hive. After configuration, I was able to successfully create the Hbase table using the Hive query with the Hive table mapping.

Hive query:

CREATE TABLE upc_hbt(key string, value string) 
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,value:value")
TBLPROPERTIES ("hbase.table.name" = "upc_hbt");


Spark Scala:

// Note the trailing space in each string segment: without it the
// concatenated HQL runs the keywords together (e.g. "string)STORED BY").
val createTableHql: String = "CREATE TABLE upc_hbt2(key string, value string) " +
  "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' " +
  "WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,value:value') " +
  "TBLPROPERTIES ('hbase.table.name' = 'upc_hbt2')"

hc.sql(createTableHql)


But when I execute the same Hive query through Spark, it throws the following error:

Exception in thread "main" org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hive.ql.metadata.HiveException: Error in loading storage handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler


It looks like when running Hive through Spark, it cannot find the jars that Hive normally loads from the auxpath (the HBase storage handler and its dependencies). Is there a way to solve this problem?
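For context, one common workaround for this kind of error is to hand the Hive-HBase handler and HBase client jars to Spark explicitly at submit time, since Spark's embedded Hive session does not pick up Hive's `hive.aux.jars.path` setting on its own. A minimal sketch, where the jar paths, versions, and application names are illustrative assumptions that must be adjusted to the actual cluster layout:

```shell
# Sketch: make org.apache.hadoop.hive.hbase.HBaseStorageHandler loadable by
# the Spark driver and executors by shipping the jars via --jars.
# All paths and the app/class names below are placeholders, not verified
# against any particular distribution.
spark-submit \
  --class com.example.MyApp \
  --jars /usr/lib/hive/lib/hive-hbase-handler.jar,\
/usr/lib/hbase/lib/hbase-client.jar,\
/usr/lib/hbase/lib/hbase-common.jar,\
/usr/lib/hbase/lib/hbase-server.jar,\
/usr/lib/hbase/lib/hbase-protocol.jar \
  my-app.jar
```

Alternatively, the same jars can be put on `spark.driver.extraClassPath` / `spark.executor.extraClassPath` in `spark-defaults.conf`.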

Thank you in advance.
