Setting metadata when using s3cmd to upload a static website to Amazon S3

I am creating a static website from Markdown using Pelican. I then upload the output to S3 using s3cmd:

s3cmd sync -rr --delete-removed output/ s3://mybucket/

Unfortunately, the Content-Type metadata for CSS files is not set correctly; I have to set it manually to "text/css". Is there a way to get s3cmd to set this (and possibly other file types) to the correct value? Or, if not, is there an alternative to s3cmd that works on Linux and Mac OS X?



1 answer


You can set the MIME type of a file when uploading it to S3 with the -m switch:

s3cmd put -m text/css ./file.css s3://bucket/path/

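If you have many files to upload, you can wrap this in a small shell loop. The following is only a rough sketch, assuming (as in the question) that the Pelican output lives in output/ and the bucket is called mybucket; extend it with more extension-to-type mappings as needed:

# Upload every CSS file with an explicit Content-Type (sketch only)
for f in output/*.css; do
    s3cmd put -m text/css "$f" "s3://mybucket/$(basename "$f")"
done
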
As a side note, what I do is first compress the file with gzip and then upload it to S3, but then you also need to set the Content-Encoding header. Note that Chrome and Safari sometimes don't like the .gz extension, so I use .jgz instead.



gzip -9 -c file.css > file.css.jgz
s3cmd put -m text/css ./file.css s3://bucket/path/
s3cmd put -m text/css --add-header "Content-Encoding: gzip" ./file.css.jgz s3://bucket/path/

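To check that both headers actually ended up on the object, you can inspect what S3 serves with curl. The URL below is just an example of the standard virtual-hosted bucket address and assumes the object is publicly readable:

curl -sI https://mybucket.s3.amazonaws.com/file.css.jgz | grep -iE 'content-(type|encoding)'
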
Also note that in the HTML file that links to this CSS you have to make sure the browser can handle gzip. Here's an example in PHP:

<?php
  // Link the gzipped stylesheet only if the client advertises gzip support
  $css = "$PATH_TO_S3_BUCKET/file.css";
  if (strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip'))
      $css .= '.jgz';

  echo "<link rel='stylesheet' type='text/css' href='$css'/>";
?>
