How can I upload a "file" to S3 by creating a temporary file using AWS Lambda?
I am writing a Lambda function whose purpose is to load a .json file from S3, change its contents, and then upload it back to the same bucket under a different key.
So in my S3 I have cloud/folder/foo.json
>>> foo.json
{
    "value1": "abc",
    "value2": "123"
}
I want to download it, change a couple of things, and re-upload it to the same location as bar.json
I have the first piece working: it loads the file's contents and changes them, but the result is now just a dictionary in memory.
import boto3
import json

def get_json():
    client = boto3.client('s3')
    response = client.get_object(Bucket='cloud', Key='folder/foo.json')
    data = response['Body'].read()
    bar = json.loads(data)
    bar["value1"] = "do-re-mi"
    # TODO: implement uploading here

def lambda_handler(event, context):
    get_json()
    return 'Hello from Lambda'
So now ...
>>> bar
{
    "value1": "do-re-mi",
    "value2": "123"
}
The variable bar is correct, but it is a dictionary object. How can I upload it directly to this bucket as bar.json? I've seen other examples here, but I'm not interested in embedding my AWS secret or access keys anywhere. Keep in mind this is running in Lambda. I cannot create a file on the machine; when I try something like the code below:
g = open('myfile.json', 'w')
g.write(json.dumps(bar, indent=4, sort_keys=True))
g.close()
with open('myfile.json', 'rb') as f:
    client.upload_fileobj(f, 'cloud', 'bar.json')
I get "errorType": "IOError", "errorMessage": "[Errno 30] Read-only filesystem: 'myfile.json'"
Any advice would be greatly appreciated. Thanks!
Thanks to monchitos82, I learned that you can write to /tmp in a Lambda. So all I had to do was prepend /tmp to my file paths, and it worked.
g = open('/tmp/myfile.json', 'w')
g.write(json.dumps(bar, indent=4, sort_keys=True))
g.close()
with open('/tmp/myfile.json', 'rb') as f:
    client.upload_fileobj(f, 'cloud', 'bar.json')
Apparently you don't even need to write a temporary file; Key.open_write seems to give you a writable file-like object that you can json.dump your JSON to. I'm still not sure whether it is currently implemented in AWS. There is also key.set_contents_from_string, which should work if you have enough spare RAM for the json.dumps() output.