How do I run the Spark driver in HA mode?

I have a Spark driver submitted to a Mesos cluster (with highly available Mesos masters) in client deploy mode (see this for details on the client deployment mode).

I also want to run the Spark driver itself in HA mode. How?

I could write my own solution for this, but I'm looking for something that is available right now.



1 answer


tl;dr Use cluster deploy mode with --supervise, e.g. spark-submit --deploy-mode cluster --supervise

Running the Spark driver in HA mode is not possible in client deploy mode, as described in the cited document:

In client mode, the Spark Mesos framework runs directly on the client machine and waits for the driver to exit.



You need to monitor the process on the client machine in some way and possibly check its exit code.

A safer solution is to let Mesos do its job. Use cluster deploy mode, in which Mesos is responsible for making sure the driver keeps running (and for restarting it when it fails). See the Cluster Mode section:

Spark on Mesos also supports cluster mode, where the driver is launched in the cluster and the client can find the results of the driver from the Mesos Web UI.
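Putting the two flags together, a full submission might look like the sketch below. Note that cluster mode on Mesos requires a running MesosClusterDispatcher to submit to; the dispatcher host, port, class name, and jar URL here are placeholders, so substitute your own.

```shell
# Sketch of an HA driver submission on Mesos (cluster mode + supervision).
# Assumes a MesosClusterDispatcher is listening at dispatcher:7077 and the
# application jar is reachable from inside the cluster (both hypothetical).
spark-submit \
  --master mesos://dispatcher:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.MyApp \
  http://repo.example.com/my-app.jar
```

With --supervise, a driver that exits with a non-zero status is restarted for you, which is the HA behavior you cannot get in client mode.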
