Amazon S3 upload script
Thursday, October 9, 2008
I want to share with you a script I've been hacking on to quickly and easily upload files to Amazon's S3 service. upload_to_s3.py takes the name of the bucket as a parameter and a list of files to upload on standard input. It is intended to run in the background, and it communicates with the user via libnotify/pynotify. When it's done uploading, the clipboard is set to the URL(s) of the uploaded file(s).
Please note that before using upload_to_s3.py, you must enter your S3 access key and secret access key in the script.
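In a script like this, the keys are typically just two module-level constants near the top of the file; the names below are only illustrative, so match them to whatever your copy of the script actually uses:

AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'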
Example usage:
cd /directory/with/media/files/
find | grep -v ".svn" | python /path/to/upload_to_s3.py bucket

The script requires Amazon's S3 Library for REST in Python. As of the 2007-11-05 version, I had to follow user uuorld's suggestion and modify line 295:

"I found that changing the line
headers['Date'] = time.strftime("%a, %d %b %Y %X GMT", time.gmtime())
to
headers['Date'] = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime())
resolved some issues I was having with different locales using 24 or 12-hour clocks."

Big thanks go out to Adrian Holovaty for his script that I hacked on quite a bit. See his blog post here.
As always, please feel free to email me comments or suggestions about the code.
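To give a sense of what the script actually does, here is a stripped-down sketch of the same idea: read file paths on standard input, upload each one with Amazon's S3 library, pop up a pynotify notification, and copy the URLs to the clipboard. The overall structure, the public-read ACL, and the xclip call are my own illustration, not a line-for-line copy of upload_to_s3.py:

import os
import sys
import urllib
import subprocess

import pynotify
import S3  # Amazon's S3 Library for REST in Python

AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'

def main():
    bucket = sys.argv[1]
    conn = S3.AWSAuthConnection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    urls = []
    for line in sys.stdin:
        path = line.strip()
        # Skip blank lines and anything that isn't a regular file
        # (plain `find` output includes directories).
        if not path or not os.path.isfile(path):
            continue
        # `find` prefixes paths with "./"; drop it so the S3 key is clean.
        key = path[2:] if path.startswith('./') else path
        filedata = open(path, 'rb').read()
        # Content-Length is required by newer S3 (see the UPDATE below).
        conn.put(bucket, key, S3.S3Object(filedata),
                 {'x-amz-acl': 'public-read',
                  'Content-Length': len(filedata)})
        urls.append('http://%s.s3.amazonaws.com/%s' % (bucket, urllib.quote(key)))
    # Notify the user and put the URL(s) on the clipboard via xclip.
    pynotify.init('upload_to_s3')
    pynotify.Notification('S3 upload finished', '\n'.join(urls)).show()
    xclip = subprocess.Popen(['xclip', '-selection', 'clipboard'],
                             stdin=subprocess.PIPE)
    xclip.communicate('\n'.join(urls))

if __name__ == '__main__':
    main()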
I'm using this script to incorporate right-click uploading into Thunar. Here's how I'm doing it:
* Edit -> Configure Custom Action -> New
* Name: "Upload to S3 random"
* Command: `echo %f | python "/path/to/upload_to_s3.py" bucket` (You need to change the path and the bucket. Do not include the backticks.)
* Icon -> Action Icons -> gtk-go-up
* Appearance conditions -> All except directories
Unfortunately, I don't believe you can use a custom action to upload multiple files with this script, because Thunar space-delimits the paths: once a path contains a space, there is no way to tell where one file ends and the next begins.
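To make the problem concrete: if you selected the two files a b.png and c.png, a %F-based action would effectively run something like

echo a b.png c.png | python /path/to/upload_to_s3.py bucket

and the script has no way of knowing whether that is two files or three.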
Enjoy!
UPDATE:
It seems S3 has become stricter about the headers required for file uploads, so this script needs a minor patch to add the Content-Length header. In the conn.put call, add
'Content-Length': len(filedata)
to the data dictionary.
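For reference, the patched call ends up looking roughly like the snippet below. The variable names and the x-amz-acl header come from my own sketch above, so match them to whatever your copy of the script actually passes to conn.put:

conn.put(bucket, key, S3.S3Object(filedata),
         {'x-amz-acl': 'public-read',
          'Content-Length': len(filedata)})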