Ultra Short Guide to Using Amazon S3 with Django

Dan Kaufhold
Bitlab Studio
Nov 23, 2016 · 3 min read


Django and S3 have been a staple of Bitlab Studio’s stack for a long time. Here I want to show you how to put those two together.

Since you probably searched for this specifically on Google, I will assume that you're familiar with Django, already know what S3 is, and want to get right into the action. Well, here we go!

Dependencies

First, add the latest versions of django-storages and boto3 to your requirements (you may need to bump the versions in the future):

django-storages==1.5.1
boto3==1.4.1
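
If you want a quick sanity check that both packages are installed, a minimal sketch like this should do (the backend import is the same one used in the custom storage section further down):

import boto3
from storages.backends.s3boto3 import S3Boto3Storage  # the backend used later on

print(boto3.__version__)  # e.g. 1.4.1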

AWS Setup

First, you need to create an AWS account.

Option 1 — using user credentials

You will need to get or create your user’s security credentials from AWS IAM (Identity and Access Management):

  • AWS Access Key ID
  • AWS Secret Access Key

Then create a new S3 bucket from the AWS S3 console in the region you prefer.

Note that you should never, ever use your root account credentials for this!
For more information, consult the IAM best practices.
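
If you want to verify the credentials before wiring them into Django, a short boto3 sketch like this works (the key values and bucket name are the placeholders used throughout this guide):

import boto3

client = boto3.client(
    's3',
    aws_access_key_id='MYACCESSKEYID',
    aws_secret_access_key='mysecretaccesskey12345',
)
# Raises a ClientError if the credentials are wrong or lack access to the bucket.
client.head_bucket(Bucket='mybucketname')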

Option 2 — using service role [recommended]

For this, you will need to add a service role to the EC2 instance while launching it. It's in step 3 of the launch wizard.

Name the role something like myproject-instance-role, select Amazon EC2 under AWS Service Roles, and finally attach the AmazonS3FullAccess policy.

When using a role, you need to set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY settings to None.
boto3 will then automatically pick up the instance role's credentials when accessing your bucket.
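
You can confirm the role works with a small check run on the instance itself (a sketch; the bucket name is a placeholder):

import boto3

# No access keys passed in -- boto3 fetches temporary credentials
# from the EC2 instance metadata service via the attached role.
client = boto3.client('s3')
client.head_bucket(Bucket='mybucketname')  # raises a ClientError if the role lacks access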

Settings

# add the credentials from IAM and bucket name
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_ACCESS_KEY_ID = 'MYACCESSKEYID' # or None if using service role
AWS_SECRET_ACCESS_KEY = 'mysecretaccesskey12345' # or None if using service role
# if False it will create unique file names for every uploaded file
AWS_S3_FILE_OVERWRITE = False
# the URL that your media and static files will be served from
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
# the sub-directories of media and static files
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
# custom storage classes (defined below), so we can easily put static and media in one bucket
STATICFILES_STORAGE = 'myproject.custom_storages.StaticStorage'
DEFAULT_FILE_STORAGE = 'myproject.custom_storages.MediaStorage'
# the regular Django file settings but with the custom S3 URLs
STATIC_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, STATICFILES_LOCATION)
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)

Custom storage

Create a file called custom_storages.py at the location you specified in the settings above (inside myproject) and add the following code:

from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage


class StaticStorage(S3Boto3Storage):
    location = settings.STATICFILES_LOCATION


class MediaStorage(S3Boto3Storage):
    location = settings.MEDIAFILES_LOCATION

That’s it

Now you can run python manage.py collectstatic and it will copy the static files into the static sub-directory of your S3 bucket.

File uploads will land in your media sub-directory.
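
For example, a plain FileField on any model will now upload straight to the bucket's media prefix (a minimal sketch; the model and field names are just examples):

from django.db import models


class Document(models.Model):
    # Files end up under media/documents/ in the bucket, and
    # document.attachment.url points at the AWS_S3_CUSTOM_DOMAIN.
    attachment = models.FileField(upload_to='documents/')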

[OPTIONAL] Copy existing media files

Install the AWS CLI globally on your server and configure it:

$ sudo apt-get install awscli
$ aws configure
# enter credentials

Copy the files from your existing media directory:

$ aws s3 cp /path/to/my/media s3://mybucketname/media --recursive --acl public-read
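
If you'd rather not install the CLI, roughly the same copy can be done with boto3 (a sketch; the local path and bucket name are the same placeholders as above):

import os

import boto3

s3 = boto3.client('s3')
root = '/path/to/my/media'
for dirpath, _, filenames in os.walk(root):
    for filename in filenames:
        local_path = os.path.join(dirpath, filename)
        # Keep the directory structure below the media/ prefix in the bucket.
        key = 'media/' + os.path.relpath(local_path, root)
        s3.upload_file(local_path, 'mybucketname', key,
                       ExtraArgs={'ACL': 'public-read'})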

The end

Thanks for checking out this post. Please like and share it if it was of any help to you. That would help us a lot in return.

Please leave a comment if this somehow doesn't work for you, and we'll try our best to keep this post updated.

Want to know more about Bitlab Studio? Visit bitlabstudio.com!

Cheers!
