Large file upload stream with aws-sdk

Is there a way to stream large files to S3 using aws-sdk?

I can't figure it out, but I'm guessing there is a way. Thanks!



2 answers


Update

My memory failed me, and I did not read the quote in my original answer correctly (see below), as is evident from the API documentation for (S3Object, ObjectVersion) write(data, options = {}):

Writes data to an object in S3. This method will intelligently choose between uploading in a single request and using #multipart_upload.

[...] You can pass :data or :file as the first argument or as options. [emphasis mine]
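For reference, here is a minimal, untested sketch of driving #multipart_upload directly, which is what write falls back to for large payloads (the bucket and key names are placeholders, and credentials are assumed to be configured elsewhere):

require 'aws-sdk'  # aws-sdk v1, as used throughout this answer

s3  = AWS::S3.new
obj = s3.buckets['my-bucket'].objects['big-file.bin']  # placeholder names

# Stream the source in chunks; every part except the last must be
# at least 5 MB, per S3's multipart upload rules.
obj.multipart_upload do |upload|
  File.open('/path/to/big-file.bin', 'rb') do |io|
    upload.add_part(io.read(5 * 1024 * 1024)) until io.eof?
  end
end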

The :data parameter is the one that will be used for streaming, apparently:

:data (Object)

- The data to upload. Valid values:

[...] Any object that responds to read and eof?; the object must support the following accessors:

read                     # all at once
read(length) until eof?  # in chunks

If you specify :data this way, you must also include the :content_length option.

[...]

:content_length (Integer)

- If provided, this option must match the total number of bytes written to S3 during the operation. This option is required if :data is an IO-like object without a size method.

[emphasis mine]

The resulting sample fragment might look like this:

# Upload a file.
key = File.basename(file_name)
s3.buckets[bucket_name].objects[key].write(:data => File.open(file_name), 
    :content_length => File.size(file_name))
puts "Uploading file #{file_name} to bucket #{bucket_name}."

Please note that I have not tested this myself yet, so be careful ;)
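To make the quoted contract concrete, here is a minimal, equally untested sketch: ChunkedReader is a hypothetical name, not part of the SDK, and it simply delegates to a wrapped IO. Since the wrapper itself has no size method, :content_length must be supplied explicitly:

# A minimal IO-like wrapper: per the documentation quoted above, it
# only needs to respond to read(length) and eof?.
class ChunkedReader
  def initialize(io)
    @io = io
  end

  def read(length = nil)
    @io.read(length)
  end

  def eof?
    @io.eof?
  end
end

file = File.open(file_name, 'rb')
s3.buckets[bucket_name].objects[key].write(
    :data           => ChunkedReader.new(file),
    :content_length => File.size(file_name))  # required: the wrapper has no #size
file.close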




Original Answer

This is explained in Upload an Object Using the AWS SDK for Ruby:

Uploading Objects

  • Create an instance of the AWS::S3 class by supplying your AWS credentials (see the sketch after this list).
  • Use the AWS::S3::S3Object#write method, which accepts a data parameter and an options hash that allow you to upload data from a file or a stream. [emphasis mine]
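For the first step, a minimal sketch of the instantiation (the credential values are placeholders):

require 'aws-sdk'  # aws-sdk v1

s3 = AWS::S3.new(
    :access_key_id     => 'YOUR_ACCESS_KEY_ID',
    :secret_access_key => 'YOUR_SECRET_ACCESS_KEY')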

The page also contains a complete example, which uses a file rather than a stream; here is the relevant snippet:

# Upload a file.
key = File.basename(file_name)
s3.buckets[bucket_name].objects[key].write(:file => file_name)
puts "Uploading file #{file_name} to bucket #{bucket_name}."

It should be easy to adapt this to use a stream instead (if I remember correctly, you just need to replace the file_name parameter with open(file_name), but be sure to verify this), for example:

# Upload a file.
# NB: as the Update above explains, streaming actually requires the
# :data option (plus :content_length), not :file.
key = File.basename(file_name)
s3.buckets[bucket_name].objects[key].write(:file => open(file_name))
puts "Uploading file #{file_name} to bucket #{bucket_name}."



I don't know how big the files you want to upload are, but for large files a "pre-signed POST" lets the user's browser bypass your server and upload directly to S3. This might be what you need: it frees your server from handling the upload itself.
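A minimal, untested sketch of what that could look like with aws-sdk v1, assuming AWS::S3::Bucket#presigned_post is available (the bucket name and key are placeholders); the returned URL and fields go into the HTML form the browser submits:

require 'aws-sdk'  # aws-sdk v1

s3     = AWS::S3.new  # assumes credentials are configured elsewhere
bucket = s3.buckets['my-bucket']  # placeholder bucket name

# Build a pre-signed POST policy; the browser then POSTs the file
# straight to S3, so the bytes never pass through your server.
post = bucket.presigned_post(:key => 'uploads/large-file.bin')

puts post.url     # the form's action URL
puts post.fields  # hidden fields to embed in the upload form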


