Writing A Python Script To Send Files To Amazon S3

written by Sudheer Satyanarayana on 2011-09-25

Amazon Simple Storage Service, or Amazon S3, is a storage service with a web API. I use Amazon S3 to store backups of my blog and other sites. I wrote a simple Python script to handle file uploads to S3.

In order to use Amazon S3, first create a bucket using your Amazon AWS account. As the name suggests, a bucket is a container for your files. You can create buckets using the AWS Management Console.
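
You can also create the bucket from Python with boto itself. Here is a minimal sketch, assuming the same placeholder credentials and the bucket name used later in this post; keep in mind that bucket names are global across all of S3:

from boto.s3.connection import S3Connection

conn = S3Connection('your account key', 'your account secret')
# Bucket names are global across S3; this raises an error if the name is taken.
bucket = conn.create_bucket('j_backups')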

The script we're going to write will take two input parameters:

Path to the local file
Target S3 path

Amazon S3 buckets don't support hierarchical directories. To simulate such a file system, you can use '/' in the target file name, as the short listing example below illustrates.
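
A sketch of what that simulation looks like from boto's side: keys containing '/' are still flat strings, but listing with a prefix and delimiter makes them browsable like a directory tree. The prefix below is made up for illustration:

from boto.s3.connection import S3Connection

conn = S3Connection('your account key', 'your account secret')
bucket = conn.get_bucket('j_backups')

# List everything that appears to live under mystore/mybackups/.
for entry in bucket.list(prefix='mystore/mybackups/', delimiter='/'):
    print(entry.name)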

To use AWS from Python, boto is the de facto standard library. You can install boto from your distribution's package management system, with pip (pip install boto), or download it from the project's repository.

Import the libraries:

#!/usr/bin/env python

import os
import sys

from boto.s3.connection import S3Connection
from boto.s3.key import Key

Read the command line arguments:

local_file_path = sys.argv[1]
s3_path = sys.argv[2]
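
The script assumes both arguments are present. A small guard, my addition rather than part of the original script, turns a missing argument into a usage message instead of an IndexError; it belongs just above the two assignments:

# Exit with a usage message when the two arguments are missing.
if len(sys.argv) != 3:
    sys.exit('usage: %s <local-file-path> <s3-path>' % sys.argv[0])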

Using the AWS Identity and Access Management (IAM) service, create a user which you can use from scripts. Make a note of the access key and secret of the newly created user.

I have named my bucket 'j_backups'.

Initialize the connection:

conn = S3Connection('your account key', 'your account secret')
pb = conn.get_bucket('j_backups')
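
Hardcoding credentials works for a private script, but boto can also read them from the environment, which keeps secrets out of the file. This assumes the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables are exported:

# With no arguments, boto falls back to the AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY environment variables (or its config file).
conn = S3Connection()
pb = conn.get_bucket('j_backups')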

Now, we're ready to create an S3 key. Once the key object is created, we set the name of the key and then its contents:

k = Key(pb)
file_name_to_use_in_s3 = "%s/%s" % (s3_path, os.path.basename(local_file_path))
k.key = file_name_to_use_in_s3
k.set_contents_from_filename(local_file_path)
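
For large backup files it helps to see progress as the upload runs. set_contents_from_filename accepts a progress callback; the reporting function below is my own sketch, not part of the original script:

def report_progress(sent, total):
    # boto calls this periodically with bytes sent so far and the total.
    print('%d of %d bytes transferred' % (sent, total))

k.set_contents_from_filename(local_file_path, cb=report_progress, num_cb=10)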

Save the file and set execute permissions on it. We can now use the script like this:

./ path/to/my/local/file "mystore/mybackups"

In this example, the file "path/to/my/local/file" will be uploaded to the S3 bucket under the key "mystore/mybackups/file".
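
To check that the upload worked, you can fetch the file back by its key. A small sketch, reusing the bucket object from above and the key name from this example:

# Look the key up by name and download its contents to a local file.
k = pb.get_key('mystore/mybackups/file')
k.get_contents_to_filename('/tmp/restored-file')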

The script is also available on GitHub.