How do I load my uber-jar onto a Spark cluster created with the spark-ec2 scripts?

I am creating an EC2 Spark cluster with a single command using SPARK_HOME/ec2/spark-ec2.

I am also passing --copy-aws-credentials (Spark 1.2.0), so sc.textFile("s3...") works fine.
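
Roughly, the launch looks like this (the key-pair name, identity file, slave count, and cluster name are placeholders):

    # Launch a small standalone cluster and copy AWS credentials to its nodes
    ./spark-ec2 -k my-keypair -i ~/my-keypair.pem -s 2 \
        --copy-aws-credentials \
        launch my-spark-cluster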

My problem is a simple one: how do I copy my jar to the master?

  • The aws CLI doesn't seem to be configured correctly on the master, so I can't pull the jar from S3 directly.
  • I could make the bucket public and just fetch the jar from there, but that is bad practice.
  • I tried to connect to the master from the outside, but the connection was blocked (see the sketch below for what I attempted).
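
This is the kind of direct copy I expected to work from my machine (the master hostname is the one printed by spark-ec2's get-master action; the key file is the one used at launch):

    # Print the master's public DNS name
    ./spark-ec2 get-master my-spark-cluster

    # Copy the uber jar to the master (the spark-ec2 AMIs log in as root)
    scp -i ~/my-keypair.pem target/my-app-assembly.jar \
        root@ec2-XX-XX-XX-XX.compute-1.amazonaws.com:~/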

What is the best practice for shipping uber jars to a standalone Spark cluster on Amazon EC2 launched through ec2/spark-ec2?
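
For context, once the jar is on the master I would log in and submit it like this (the class name and jar path are placeholders; the master URL is the one shown on the standalone web UI at port 8080):

    # Log in to the master
    ./spark-ec2 -k my-keypair -i ~/my-keypair.pem login my-spark-cluster

    # On the master: submit the uber jar to the standalone cluster
    ~/spark/bin/spark-submit \
        --class com.example.MyApp \
        --master spark://MASTER_PUBLIC_DNS:7077 \
        ~/my-app-assembly.jar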
