How to get default property values in Spark

I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to check several default properties, so I ran the following statement in spark-shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")


I was expecting the getConf call to return the value 0.13.1, as described in this link. But I got the following exception:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)


Am I retrieving the properties correctly?

+3




2 answers


You can use

sc.getConf.toDebugString

OR

sqlContext.getAllConfs

which will return all the values that have been set; however, some defaults are only specified in the code. In your specific example, the lookup in the code is:



getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)


where the default value is hard-coded as:

val hiveExecutionVersion: String = "0.13.1"


So getConf will try to pull the metastore version from the config, falling back to the default; the value is simply not present in the config itself.
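If you just want the lookup in the shell to succeed rather than throw NoSuchElementException, SQLContext.getConf also has a two-argument form that takes an explicit fallback. A minimal spark-shell sketch, assuming the 1.4.x API; the "0.13.1" fallback here is simply the value of hiveExecutionVersion copied in by hand:

// Returns the configured value if the key is set, otherwise the supplied fallback
sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")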

+3




In Spark 2.x, if I want to know the default value of a Spark conf property, I do this:

The command below returns a Scala Map in spark-shell.

spark.sqlContext.getAllConfs 




To find the value of a specific conf property, e.g. the default warehouse directory used by the Spark installation, spark.sql.warehouse.dir:

spark.sqlContext.getAllConfs.get("spark.sql.warehouse.dir")
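Since getAllConfs returns a plain Scala Map[String, String], the .get call above gives back an Option, so you can attach your own fallback with getOrElse. In Spark 2.x there is also spark.conf.get, which accepts a default directly. A minimal sketch, assuming a standard 2.x spark-shell session; the "<not set>" fallback string is just a placeholder:

// Option-based lookup over the Map returned by getAllConfs, with an explicit fallback
spark.sqlContext.getAllConfs.getOrElse("spark.sql.warehouse.dir", "<not set>")

// RuntimeConfig lookup with a default value, avoiding NoSuchElementException for unset keys
spark.conf.get("spark.sql.warehouse.dir", "<not set>")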


+2








