How to pass an environment variable to spark-submit
I am using Apache Spark 1.2.0 and want my custom environment variable $MY_KEY to be available to my Java job when it is executed with master=local.
In a plain Java environment, this can be passed with a -D parameter, but it is not recognized when I start my driver with spark-submit.
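For reference, here is roughly what works for me in plain Java, outside Spark (the class and jar names are placeholders, not my actual job):

    # Set a system property on the JVM command line; the shell expands $MY_KEY
    java -Dkeyfile="$MY_KEY" -cp myjob.jar com.example.MyJob
    # Inside the JVM the value is then readable via System.getProperty("keyfile")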
I tried adding this to conf/spark-defaults.conf:

    spark.driver.extraJavaOptions -Dkeyfile="${MY_KEY}"

but Spark does not resolve the $MY_KEY environment variable when it runs my Java job (I can see this in my logs).
I tried passing the same setting as an argument when calling spark-submit, but that doesn't work either.
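Roughly, the invocation looked like this (class and jar names are placeholders):

    # Single quotes keep my shell from expanding ${MY_KEY}, so Spark receives
    # it literally and then never resolves it:
    spark-submit --master local \
      --conf 'spark.driver.extraJavaOptions=-Dkeyfile=${MY_KEY}' \
      --class com.example.MyJob myjob.jar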
The same problem occurs when I add it to conf/spark-env.sh.
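For example, something like the following (the exact line is my reconstruction; SPARK_JAVA_OPTS is deprecated in Spark 1.x but still read by the launch scripts):

    # conf/spark-env.sh is sourced by the launch scripts; the single quotes
    # keep ${MY_KEY} literal, matching the unresolved value I see in my logs
    export SPARK_JAVA_OPTS='-Dkeyfile=${MY_KEY}'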
The only thing that has worked is to edit the bin/spark-submit script itself, which defeats the purpose of reading the value from an existing environment variable and will be overwritten when Spark is updated.
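For illustration, the edit amounts to something like this near the top of bin/spark-submit (a sketch, not the script's actual contents; the path is a placeholder):

    # Hard-code the value into the launcher itself: it no longer comes from
    # the environment, and the change is lost on every Spark upgrade.
    # SPARK_SUBMIT_OPTS is added to the driver JVM options by bin/spark-class.
    export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dkeyfile=/path/to/my.key"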
So it seems to me that spark-submit ignores the environment variables of the current user and only honors a limited subset of variables defined in its conf files. Does anyone know how I can solve this?