December 19, 2011
by Tobias McNulty
Categories:
Technical

Using Django and Celery with Amazon SQS

Amazon's Simple Queue Service (SQS) is a relatively new offering in the family of Amazon Web Services (AWS). It's also an appealing one, because it promises to quickly and easily replace a common component of the stack in a typical web application, thereby obviating the need to run a separate queue server like RabbitMQ. While RabbitMQ, the typical favorite for Celery users, is not necessarily difficult to install or maintain, removing it from the stack of a web application means one less component that might fail. Offloading that service to AWS might also prove financially advantageous, especially for applications with a small to moderate queue volume.

While it's quite easy to use Celery with Amazon's Simple Queue Service (SQS), there's currently not much information out there about how to do it. There's this post on the celery-users list that didn't leave me with much hope, and this question on StackOverflow that sounded slightly more promising. I still couldn't find a step-by-step how-to, however, and it turned out to be quite easy, so here's my take:

  1. Upgrade to the latest versions of kombu, celery, and django-celery. At the time of this writing, those versions are 1.5.1, 2.4.5, and 2.4.2:

    pip install kombu==1.5.1
    pip install celery==2.4.5
    pip install django-celery==2.4.2
    
  2. Add the following lines to settings.py (or local_settings.py depending on your setup):

    BROKER_TRANSPORT = 'sqs'
    BROKER_TRANSPORT_OPTIONS = {
        'region': 'us-east-1',
    }
    BROKER_USER = AWS_ACCESS_KEY_ID
    BROKER_PASSWORD = AWS_SECRET_ACCESS_KEY
    

    In the above, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY should point to the appropriate AWS access key and secret for the account you want to use. Pro tip: Use AWS's Identity and Access Management (IAM) to set up an API key and secret that only has access to the services your web application will use (typically one or more of SQS, SES, and SimpleDB).

  3. Finally, if you'll be running multiple servers or environments on the same AWS account (e.g., two different web apps or staging and production environments of the same app), you may want to customize the SQS queue name being used (the default is "celery"). To make this change, add the following lines to your settings.py (or again, local_settings.py):

    CELERY_DEFAULT_QUEUE = 'celery-myapp-production'
    CELERY_QUEUES = {
        CELERY_DEFAULT_QUEUE: {
            'exchange': CELERY_DEFAULT_QUEUE,
            'binding_key': CELERY_DEFAULT_QUEUE,
        }
    }
    

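To sanity-check the setup, it can help to define a trivial task and send it through the new queue. This is only a sketch; myapp/tasks.py and the add task are illustrative names I'm introducing here, not part of the configuration above (the fallback decorator just lets the snippet run even where celery isn't installed):

```python
# Hypothetical myapp/tasks.py: a trivial task for smoke-testing the SQS broker.
try:
    from celery.task import task  # celery 2.4-era task decorator
except ImportError:
    # Fallback so this sketch runs without celery installed; with celery
    # present, @task registers the function with the configured broker.
    task = lambda f: f

@task
def add(x, y):
    # Calling add(2, 3) runs synchronously; add.delay(2, 3) (with celery
    # installed) enqueues a message on the configured SQS queue instead.
    return x + y
```

With django-celery, the worker is typically started with `python manage.py celeryd`; once it's running, `add.delay(2, 3)` should produce a message on the queue named by CELERY_DEFAULT_QUEUE, which you can confirm in the SQS console.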
For the curious, Celery's support for SQS lies in the underlying Kombu library, the latest version of which includes a transport for SQS. While some posts I found (including the StackOverflow one) suggest using the BROKER_URL syntax for pointing to AWS, I found it simpler to use the BROKER_USER and BROKER_PASSWORD settings. I also saw reports that slashes in your API secret could confuse the underlying URL parser, and since my API secret happened to include a number of slashes, I went straight to using BROKER_USER and BROKER_PASSWORD.
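If you do want the BROKER_URL form, percent-encoding the credentials should sidestep the slash problem. This is a sketch with placeholder credentials, not settings from the steps above:

```python
# Sketch of the alternative BROKER_URL form. Slashes (and other reserved
# characters) in the secret must be percent-encoded, or the URL parser will
# misread them -- which is why BROKER_USER/BROKER_PASSWORD is simpler.
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote  # Python 2, current when this was written

AWS_ACCESS_KEY_ID = 'AKIAEXAMPLE'       # placeholder, not a real key
AWS_SECRET_ACCESS_KEY = 'abc/def+ghi'   # placeholder; note the slash

# safe='' forces '/' and '+' to be encoded as %2F and %2B
BROKER_URL = 'sqs://%s:%s@' % (
    quote(AWS_ACCESS_KEY_ID, safe=''),
    quote(AWS_SECRET_ACCESS_KEY, safe=''),
)
# BROKER_URL is now 'sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@'
```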

Anyway, I hope this helps someone else looking to solve the same problem. Don't hesitate to comment if you run into any issues or have a better way to go about this!
