How do I load my uber-jar into a spark cluster created using spark-ec2 scripts?
I am creating an EC2 Spark cluster in one line using the spark-ec2 script.
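The launch command looks roughly like this (a sketch for the spark-ec2 script bundled with Spark 1.2.0; the key pair name, identity file, slave count, and cluster name are placeholders):

```shell
# Launch a named cluster with the spark-ec2 script from the Spark distribution.
# -k: EC2 key pair name, -i: path to the matching .pem file, -s: number of slaves.
./ec2/spark-ec2 -k my-keypair -i ~/my-keypair.pem -s 2 launch my-spark-cluster
```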
I am also building my application as an uber-jar against Spark 1.2.0.
My question is simple: how do I copy my jar to the master?
- The AWS CLI doesn't seem to be configured on the cluster, so I can't use S3 directly.
- I could make an S3 bucket public and download the jar from it, but this is bad practice.
- I tried to connect to the master from outside, but the connection was blocked.
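For context, the attempts above look roughly like this (a sketch with placeholder jar names, bucket names, and hostnames, not my exact commands):

```shell
# Attempt 1: push the jar through S3 (fails here because the AWS CLI
# is not configured with credentials on the cluster nodes)
aws s3 cp target/my-app-assembly-1.0.jar s3://my-bucket/

# Attempt 2: copy the jar straight to the master over SSH using the
# same key pair that spark-ec2 used to launch the cluster (login is root)
scp -i ~/my-keypair.pem target/my-app-assembly-1.0.jar root@<master-public-dns>:/root/
```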
What is the best practice for shipping an uber-jar to an offline Spark cluster on Amazon EC2 launched via spark-ec2?