How do I create an S3 bucket with Boto3?
I want to enable CloudTrail logs for my account, and that requires creating an S3 bucket. I need to automate this task with Boto3. Currently I am using the following script:

sess = Session(aws_access_key_id=tmp_access_key,
               aws_secret_access_key=tmp_secret_key,
               aws_session_token=security_token)
s3_conn_boto3 = sess.client(service_name='s3', region_name=region)
bucket = s3_conn_boto3.create_bucket(Bucket=access_log_bucket_name,
                                     CreateBucketConfiguration={'LocationConstraint': 'us-east-1'},
                                     ACL='authenticated-read', ...)
I'm new to Boto3, so I don't know much about the other parameters such as GrantWrite, GrantWriteACP, etc.
Please provide a code snippet that creates an S3 bucket and enables CloudTrail logging to it.
Thanks.
First, if you have already configured your credentials with "aws configure", you do not need that "sess" Session at all ( http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html ):
# if you already done aws configure
import boto3
s3 = boto3.client("s3")
s3.create_bucket(Bucket="mybucket", ....)
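One caveat with create_bucket, relevant to the 'us-east-1' value in the question: S3 rejects CreateBucketConfiguration={'LocationConstraint': 'us-east-1'} because us-east-1 is the default region and the configuration must simply be omitted there. A small helper, sketched below with a placeholder bucket name, builds the right arguments:

```python
def create_bucket_kwargs(bucket_name, region):
    """Build create_bucket() arguments.

    us-east-1 is the default region and must NOT be passed
    as a LocationConstraint; any other region must be.
    """
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# With credentials configured you would then call, e.g.:
#   import boto3
#   s3 = boto3.client("s3")
#   s3.create_bucket(**create_bucket_kwargs("mybucket", "eu-west-1"))
```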
Secondly, the boto3 documentation unfortunately does not link to this information directly. It can be found in the boto3 PDF, page 2181 ( https://media.readthedocs.org/pdf/boto3/latest/boto3.pdf ):
Email: The value in the Grantee object is the registered email address of an AWS account.
Grantee: The AWS user or group that you want to have access to transcoded files and playlists. To identify the user or group, you can specify the canonical user ID for an AWS account, an origin access identity for a CloudFront distribution, the registered email address of an AWS account, or one of the predefined Amazon S3 groups.
The easiest approach, though, is to use a bucket policy ( http://support.cloudcheckr.com/getting-started-with-cloudcheckr/preparing-your-aws-account/aggregate-cloudtrail/ ). You can set everything with put_bucket_policy() and skip the awkward GrantWrite and GrantWriteACP parameters entirely.
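For example, here is a minimal sketch of that bucket-policy approach. The bucket name and account ID are placeholders, and the two statements follow the standard CloudTrail-to-S3 bucket policy (ACL check plus write permission), so double-check them against the current AWS docs:

```python
import json


def cloudtrail_bucket_policy(bucket_name, account_id):
    """Build the standard bucket policy that lets CloudTrail deliver logs."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # CloudTrail checks the bucket ACL before delivering logs
                "Sid": "AWSCloudTrailAclCheck",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:GetBucketAcl",
                "Resource": "arn:aws:s3:::%s" % bucket_name,
            },
            {   # CloudTrail writes log files under AWSLogs/<account-id>/
                "Sid": "AWSCloudTrailWrite",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::%s/AWSLogs/%s/*" % (bucket_name, account_id),
                "Condition": {
                    "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
                },
            },
        ],
    }


def attach_cloudtrail_policy(bucket_name, account_id):
    """Create the bucket and attach the policy (needs AWS credentials)."""
    import boto3  # imported here so the policy builder works without boto3
    s3 = boto3.client("s3")
    s3.create_bucket(Bucket=bucket_name)
    s3.put_bucket_policy(
        Bucket=bucket_name,
        Policy=json.dumps(cloudtrail_bucket_policy(bucket_name, account_id)),
    )
```

After that you can point CloudTrail at the bucket, e.g. with the cloudtrail client's create_trail(Name=..., S3BucketName=...).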
Go through the following migration documentation:
http://boto3.readthedocs.io/en/latest/guide/migrations3.html
Create a connection
Boto 3 has both low-level clients and higher-level resources. For Amazon S3, higher-level resources are most similar to the Boto 2.x s3 module:
Boto 2.x

import boto
s3_connection = boto.connect_s3()
Boto 3
import boto3
s3 = boto3.resource('s3')
Bucket creation
Bucket creation in Boto 2 and Boto 3 is very similar, except that in Boto 3 all action parameters must be passed through keyword arguments and the bucket configuration must be specified manually:
Boto 2.x
s3_connection.create_bucket('mybucket')
s3_connection.create_bucket('mybucket', location=Location.USWest)
Boto 3
s3.create_bucket(Bucket='mybucket')
s3.create_bucket(Bucket='mybucket', CreateBucketConfiguration={
'LocationConstraint': 'us-west-1'})
Data storage
Saving data from a file, stream, or string is easy:
Boto 2.x
from boto.s3.key import Key
key = Key('hello.txt')
key.set_contents_from_file('/tmp/hello.txt')
Boto 3
s3.Object('mybucket', 'hello.txt').put(Body=open('/tmp/hello.txt', 'rb'))
import boto3
client = boto3.client('s3')
response = client.create_bucket(
ACL='private'|'public-read'|'public-read-write'|'authenticated-read',
Bucket='string',
CreateBucketConfiguration={
'LocationConstraint': 'EU'|'eu-west-1'|'us-west-1'|'us-west-2'|'ap-south-1'|'ap-southeast-1'|'ap-southeast-2'|'ap-northeast-1'|'sa-east-1'|'cn-north-1'|'eu-central-1'
},
GrantFullControl='string',
GrantRead='string',
GrantReadACP='string',
GrantWrite='string',
GrantWriteACP='string')
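As for the Grant* parameters shown above: each takes an ACL grant-header string of the form id="<canonical-user-id>", uri="<predefined-group-URI>", or emailAddress="<email>". A small sketch of building such strings (the group URIs are the real predefined Amazon S3 groups; any canonical ID you pass in is your own):

```python
# Predefined Amazon S3 group URIs usable in grant headers
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"
AUTHENTICATED_USERS = "http://acs.amazonaws.com/groups/global/AuthenticatedUsers"
LOG_DELIVERY = "http://acs.amazonaws.com/groups/s3/LogDelivery"


def grant_by_uri(uri):
    """Format a predefined-group grant for GrantRead/GrantWrite/etc."""
    return 'uri="%s"' % uri


def grant_by_id(canonical_user_id):
    """Format a canonical-user-ID grant."""
    return 'id="%s"' % canonical_user_id

# e.g. client.create_bucket(Bucket="mybucket",
#                           GrantRead=grant_by_uri(AUTHENTICATED_USERS))
```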