Copy files from S3 to an EC2 instance using Boto3 (script running on a local server)?

I am running a Python script using Boto3 (first time using Boto/Boto3) on my local server that watches an S3 bucket for new files. When it detects new files in the bucket, it starts a stopped EC2 instance, which loads the software to process those files, and then it somehow needs to instruct S3/EC2 to copy the new files from S3 to the instance. How can I achieve this with a Boto3 script executed on my local server?

Essentially, the local script orchestrates the whole process: it has to start an instance when there are new files, have them processed on the EC2 instance, and copy the processed files back to S3. Right now I am trying to figure out how to get files copied from S3 to EC2 using a script that runs locally. I would like to avoid downloading from S3 to my local server and then uploading from there to EC2.
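
For context, the orchestration piece I have working so far looks roughly like this (the bucket name, prefix, and instance ID are placeholders); what I am missing is the step that gets the new objects onto the instance itself:

    import boto3

    BUCKET = "my-processing-bucket"      # placeholder
    PREFIX = "incoming/"                 # placeholder
    INSTANCE_ID = "i-0123456789abcdef0"  # placeholder

    s3 = boto3.client("s3")
    ec2 = boto3.client("ec2")

    def new_objects(seen_keys):
        """Return keys under PREFIX that have not been processed yet."""
        resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
        return [o["Key"] for o in resp.get("Contents", []) if o["Key"] not in seen_keys]

    def start_worker():
        """Start the stopped instance and wait until it is running."""
        ec2.start_instances(InstanceIds=[INSTANCE_ID])
        ec2.get_waiter("instance_running").wait(InstanceIds=[INSTANCE_ID])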

Suggestions / ideas?





3 answers


You should consider using Lambda for any S3-based event handling. Why keep servers up and running when you don't need to?
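
As a rough sketch (the instance ID is a placeholder, and this assumes you configure an S3 bucket notification to invoke the function), a Lambda handler could start your stopped worker instance whenever a new object lands, so nothing has to poll from your local server:

    import boto3

    ec2 = boto3.client("ec2")
    INSTANCE_ID = "i-0123456789abcdef0"  # your processing instance (placeholder)

    def lambda_handler(event, context):
        # S3 invokes this once per notification; each record describes a new object.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"New object: s3://{bucket}/{key}")

        # Start the worker; it can pull the objects itself when it boots
        # (for example via a user-data or startup script).
        ec2.start_instances(InstanceIds=[INSTANCE_ID])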







If the bucket name and other parameters don't change, you can achieve this simply by putting a script on your EC2 instance that pulls the latest content from the bucket, and setting that script to run every time the instance starts up.
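
A minimal version of such a startup script, run on the instance itself (the bucket name, prefix, and destination directory are assumptions), could pull everything under a prefix down with Boto3:

    import os
    import boto3

    BUCKET = "my-processing-bucket"  # placeholder
    PREFIX = "incoming/"             # placeholder
    DEST = "/data/incoming"          # local directory on the EC2 instance (placeholder)

    s3 = boto3.client("s3")

    # Download every object under PREFIX into DEST, preserving the key layout.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.endswith("/"):  # skip "folder" placeholder keys
                continue
            target = os.path.join(DEST, os.path.relpath(key, PREFIX))
            os.makedirs(os.path.dirname(target), exist_ok=True)
            s3.download_file(BUCKET, key, target)

Running it from cron's @reboot, a systemd unit, or the instance's user data makes it fire on every start.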



If the S3 command options do change and you have to drive them from your local machine with Boto, you need a way to run commands on the EC2 instance remotely, e.g. over SSH. Check this module: boto.manage.cmdshell, and this similar question: Boto Execute shell command on ec2 instance.
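
Note that boto.manage.cmdshell belongs to the old Boto 2 library. With Boto3, one alternative to SSH (assuming the instance runs the SSM agent and has an instance profile that allows Systems Manager) is to send the copy command through SSM, roughly like this:

    import boto3

    ssm = boto3.client("ssm")
    INSTANCE_ID = "i-0123456789abcdef0"  # placeholder

    # Ask the instance to pull the new objects down with the AWS CLI.
    response = ssm.send_command(
        InstanceIds=[INSTANCE_ID],
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [
            "aws s3 cp s3://my-processing-bucket/incoming/ /data/incoming/ --recursive"
        ]},
    )
    command_id = response["Command"]["CommandId"]  # use this to poll the command's status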





Did you ever solve this? Did you try SQS? I have a similar requirement.




