Amazon Simple Storage Service (Amazon S3) is a storage service with a web API. I use Amazon S3 to store backups of my blog and other sites. I wrote a simple Python script to handle file uploads to S3.
In order to use Amazon S3, first create a bucket using your Amazon AWS account. As the name suggests, a bucket is a container for your files. You can create buckets using the AWS Management Console.
The script we're going to write will take two input parameters:
- Path to file, local
- Target S3 path
Amazon S3 buckets don't support hierarchical directories. To simulate a directory structure, you can use '/' in the target file name (the key).
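To make the flat-namespace idea concrete, here is a small sketch (the key names are my own illustrations, not from the script) showing that "directories" in S3 are just shared key prefixes:

```python
# S3 has no real directories: a bucket is a flat mapping from key to object.
# Keys that share a '/'-separated prefix merely look like a directory tree.
keys = [
    "mystore/mybackups/blog.tar.gz",
    "mystore/mybackups/site.tar.gz",
    "mystore/logs/2013-01.log",
]

# "Listing a directory" is just filtering keys by prefix, the same way
# S3's server-side prefix listing works.
backups = [k for k in keys if k.startswith("mystore/mybackups/")]
print(backups)
```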
Import the libraries:
```python
#!/usr/bin/env python
import os
import sys

from boto.s3.connection import S3Connection
from boto.s3.key import Key
```
Read the command line arguments:
```python
local_file_path = sys.argv[1]
s3_path = sys.argv[2]
```
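The script will crash with an IndexError if either argument is missing. A small guard (my addition, not part of the original script) makes the failure friendlier:

```python
def parse_args(argv):
    # argv[0] is the script name; we expect exactly two more arguments.
    if len(argv) != 3:
        raise SystemExit("usage: send-to-s3.py <local-file> <s3-path>")
    return argv[1], argv[2]

# Example invocation with a fake argv list:
local_file_path, s3_path = parse_args(["send-to-s3.py", "a.txt", "mystore"])
print(local_file_path, s3_path)
```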
Using the AWS Identity and Access Management (IAM) service, create a user account which you can use from scripts. Make a note of the access key and secret of the newly created user.
I have named my bucket 'j_backups'.
Initialize the connection:
```python
conn = S3Connection('your account key', 'your account secret')
pb = conn.get_bucket('j_backups')
```
Now we're ready to create an S3 key. Once the key object is created, we set the name of the key and its contents.
```python
k = Key(pb)
file_name_to_use_in_s3 = "%s/%s" % (s3_path, os.path.basename(local_file_path))
k.name = file_name_to_use_in_s3
k.set_contents_from_filename(local_file_path)
sys.exit(0)
```
Save the file as send-to-s3.py and set execute permissions. We can now use the script like this:
```shell
./send-to-s3.py path/to/my/local/file "mystore/mybackups"
```
In this example, the local file "path/to/my/local/file" will be uploaded to the S3 bucket under the key "mystore/mybackups/file", since the script appends the local file's base name to the target path.
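The key composition can be checked in plain Python, mirroring the `"%s/%s"` expression used in the script:

```python
import os

s3_path = "mystore/mybackups"
local_file_path = "path/to/my/local/file"

# Same construction as in the script: target path + '/' + base name.
key_name = "%s/%s" % (s3_path, os.path.basename(local_file_path))
print(key_name)  # mystore/mybackups/file
```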
The script is also available on GitHub.