First steps with Django¶
Using Celery with Django¶
Note
Previous versions of Celery required a separate library to work with Django, but since 3.1 this is no longer the case. Django is supported out of the box now so this document only contains a basic way to integrate Celery and Django. You will use the same API as non-Django users so it’s recommended that you read the First Steps with Celery tutorial first and come back to this tutorial. When you have a working example you can continue to the Next Steps guide.
To use Celery with your Django project you must first define an instance of the Celery library (called an “app”).
If you have a modern Django project layout like:
- proj/
  - proj/__init__.py
  - proj/settings.py
  - proj/urls.py
- manage.py
then the recommended way is to create a new proj/proj/celery.py module that defines the Celery instance:
proj/proj/celery.py:

```python
from __future__ import absolute_import

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

from django.conf import settings  # noqa

app = Celery('proj')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
```
Then you need to import this app in your proj/proj/__init__.py module. This ensures that the app is loaded when Django starts so that the @shared_task decorator (mentioned later) will use it:

proj/proj/__init__.py:

```python
from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app  # noqa
```
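To illustrate, a reusable app can then define its tasks with the @shared_task decorator instead of importing the app instance directly. This is only a sketch; the demoapp name and the add task are placeholders for illustration:

```python
# demoapp/tasks.py -- a hypothetical reusable app's task module.
from __future__ import absolute_import

from celery import shared_task


@shared_task
def add(x, y):
    # shared_task avoids importing the app instance directly; the
    # task binds to whichever app is current, i.e. the one imported
    # in proj/proj/__init__.py above.
    return x + y
```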
Note that this example project layout is suitable for larger projects; for simple projects you may use a single contained module that defines both the app and tasks, like in the First Steps with Celery tutorial.
Let’s break down what happens in the first module. First we enable absolute imports from the future, so that our celery.py module will not clash with the library:

```python
from __future__ import absolute_import
```
Then we set the default DJANGO_SETTINGS_MODULE for the celery command-line program:

```python
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
```
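Note that os.environ.setdefault only sets the variable when it is not already present in the environment, so you can still point the celery program at a different settings module from the shell; the production settings path below is hypothetical:

```bash
$ DJANGO_SETTINGS_MODULE='proj.settings.production' celery -A proj worker -l info
```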
Specifying the settings here means the celery command-line program will know where your Django project is. This statement must always appear before the app instance is created, which is what we do next:
```python
app = Celery('proj')
```
This is your instance of the library; you can have many instances, but there’s probably no reason for that when using Django.
We also add the Django settings module as a configuration source for Celery. This means that you don’t have to use multiple configuration files, and instead configure Celery directly from the Django settings.
You can pass the object directly here, but using a string is better since then the worker doesn’t have to serialize the object when using Windows or execv:
```python
app.config_from_object('django.conf:settings')
```
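With this configuration source in place, Celery settings can live alongside your Django settings using the regular uppercase setting names. A minimal sketch, where the broker and result backend URLs are assumptions you would adapt:

```python
# proj/settings.py (excerpt) -- example values only.
BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
```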
Next, a common practice for reusable apps is to define all tasks in a separate tasks.py module, and Celery does have a way to autodiscover these modules:

```python
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```
With the line above Celery will automatically discover tasks in reusable apps if you follow the tasks.py convention:
- app1/
  - app1/tasks.py
  - app1/models.py
- app2/
  - app2/tasks.py
  - app2/models.py
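Autodiscovered tasks are registered under names derived from their module, so a task defined in app1/tasks.py can also be sent by its fully qualified name; the add task here is hypothetical:

```python
from proj.celery import app

# Equivalent to calling app1.tasks.add.delay(2, 2), but looked up
# by the registered task name instead of an import.
app.send_task('app1.tasks.add', args=(2, 2))
```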
This way you do not have to manually add the individual modules to the CELERY_IMPORTS setting. The lambda is there so that the autodiscovery only happens when needed, and so that importing your module will not evaluate the Django settings object.
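To see what the lambda buys you, compare the eager and lazy forms; this is just a sketch, with the eager form evaluating the settings object as a side effect of importing celery.py:

```python
# Eager: touches django.conf.settings as soon as celery.py is imported.
app.autodiscover_tasks(settings.INSTALLED_APPS)

# Lazy: the settings object is only evaluated when Celery actually
# performs the autodiscovery.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```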
Finally, the debug_task example is a task that dumps its own request information. This is using the new bind=True task option introduced in Celery 3.1 to easily refer to the current task instance.
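As a sketch of what bind=True does, the decorated function receives the task instance as its first argument, so the executing request and methods like retry become available; the add task below is hypothetical:

```python
@app.task(bind=True)
def add(self, x, y):
    # `self` is the task instance; self.request holds the id,
    # arguments, and hostname of the currently executing request.
    print('Running task {0}'.format(self.request.id))
    return x + y
```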
Starting the worker process¶
In a production environment you will want to run the worker in the background as a daemon - see Running the worker as a daemon - but for testing and development it is useful to be able to start a worker instance with the celery worker command, much as you would use Django’s runserver:

```bash
$ celery -A proj worker -l info
```
For a complete listing of the command-line options available, use the help command:

```bash
$ celery help
```
Where to go from here¶
If you want to learn more you should continue to the Next Steps tutorial, and after that you can study the User Guide.