Django, Celery, and Flower: Creating and Managing Asynchronous Tasks

Sept. 21, 2020, 12:46 p.m.
Django · 6 min read

Introduction to Asynchronous tasks in Django

If you are reading this, chances are you're familiar with the Django framework.  If not, feel free to check out our Beginner's Guide to Django Web Apps.  In this post, we discuss how to set up asynchronous background tasks in Django using Celery, Redis, and a monitoring tool called Flower.  With asynchronous tasks, multiple tasks can be processed in parallel, whereas synchronous tasks are executed one by one.  Celery is a task queue written in Python that distributes work amongst workers, enabling tasks to be executed asynchronously.  However, Celery requires a message broker that acts as an intermediary between the Django application and the Celery task queue.  For this tutorial, we will use Redis as our message broker.  Lastly, to view our workers and see the tasks they complete, we will use Flower.  Below we list a summary of the various features and official websites for each piece of software.  
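To make the synchronous/asynchronous distinction concrete before introducing Celery, here is a minimal stdlib-only sketch (no Celery or Redis required; `slow_task` and its timing are invented for illustration) that runs the same work one by one and then in parallel threads:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_task(n):
    """Stand-in for real background work, such as sending an email."""
    time.sleep(0.2)  # simulate I/O latency
    return n * n

# Synchronous: tasks run one by one (~0.6 s for three tasks)
start = time.perf_counter()
sync_results = [slow_task(n) for n in range(3)]
sync_elapsed = time.perf_counter() - start

# Asynchronous: tasks run in parallel (~0.2 s for three tasks)
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    async_results = list(pool.map(slow_task, range(3)))
async_elapsed = time.perf_counter() - start

print(sync_results, async_results)  # same results either way
print(async_elapsed < sync_elapsed)  # parallel finishes sooner
```

Celery takes the same idea further: the work runs in separate worker processes (possibly on other machines), with a broker like Redis relaying the task messages between them.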


Celery Features

  • High availability
  • Easy to use and maintain
  • Fast - millions of tasks can be processed in a minute
  • Supports multiple brokers: RabbitMQ, Redis, Amazon SQS, and more
  • Schedule tasks using the Python datetime module
  • Official Docs:

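The datetime-based scheduling mentioned above boils down to computing a future timestamp and handing it to Celery. A minimal sketch of that computation (the `apply_async` calls are shown only in comments, since they need a running broker and worker):

```python
from datetime import datetime, timedelta, timezone

# Compute a point in time five minutes from now
eta = datetime.now(timezone.utc) + timedelta(minutes=5)

# Given a Celery task named `test`, you would then schedule it with:
#   test.apply_async(args=(15,), eta=eta)        # run at a specific time
#   test.apply_async(args=(15,), countdown=300)  # or: run in 300 seconds

print(eta)
```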

Redis Features

  • Supports a wide range of data structures: strings, hashes, lists, sets, bitmaps, geospatial indexes, and many more
  • Can be used as a database, cache, and message broker
  • Can store 512 MB in a single key and value
  • Pub/sub messaging system for high-performance messaging 
  • High read and write speed
  • Interactive tutorial to understand how Redis stores data:
  • Official Docs:


Flower Features

  • Real-time monitoring with task progress, start time, arguments, and statistics
  • Control Celery workers remotely - restart, revoke, or terminate worker instances
  • Monitor broker
  • Simple and straightforward dashboard
  • Official Docs:

Flower Preview:

(Screenshot: the Flower dashboard)


Django Celery Redis Tutorial:

For this tutorial, we will simply be creating a background task that takes in an argument and prints a string containing the argument when the task is executed.  Of course, background tasks have many other use cases, such as sending emails, converting images to smaller thumbnails, and scheduling periodic tasks.
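As an aside, the periodic-task use case mentioned above is handled by Celery beat. A hypothetical settings fragment (the schedule name and task path below are invented for illustration) looks like:

```python
# settings.py (hypothetical example)
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'nightly-cleanup': {                        # invented schedule name
        'task': 'main.tasks.test',              # dotted path to a task
        'schedule': crontab(hour=3, minute=0),  # every day at 03:00
        'args': (15,),
    },
}
```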

  • Create a virtual environment, activate the environment, install Django, and set up a project (Note: the following commands are for Windows 10)
  • C:\Users\Owner\Desktop\code>py -m venv celery-demo
    C:\Users\Owner\Desktop\code>cd celery-demo
    C:\Users\Owner\Desktop\code\celery-demo>Scripts\activate
    (celery-demo) C:\Users\Owner\Desktop\code\celery-demo>pip install django
    (celery-demo) C:\Users\Owner\Desktop\code\celery-demo>django-admin startproject mysite
  • Change the directory to mysite and install Celery with the Redis broker.  Obviously, you can use a different broker besides Redis here
  • (celery-demo) C:\Users\Owner\Desktop\code\celery-demo\mysite>pip install "celery[redis]"
  • In the mysite/mysite folder (alongside settings.py), create a new file called celery.py
  • First, we need to define the Celery instance.  In celery.py, add the following code:
  • import os
    from celery import Celery
    # set the default Django settings module for the 'celery' program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
    app = Celery('mysite', backend='redis', broker='redis://localhost:6379')
    # Using a string here means the worker doesn't have to serialize
    # the configuration object to child processes.
    # - namespace='CELERY' means all celery-related configuration keys
    #   should have a `CELERY_` prefix.
    app.config_from_object('django.conf:settings', namespace='CELERY')
    # Load task modules from all registered Django app configs.
    app.autodiscover_tasks()
    @app.task(bind=True)
    def debug_task(self):
        print('Request: {0!r}'.format(self.request))
  • Next, we need Celery to run when Django starts.  In mysite/mysite/__init__.py, add the following:
  • from __future__ import absolute_import, unicode_literals
    # This will make sure the app is always imported when
    # Django starts so that shared_task will use this app.
    from .celery import app as celery_app
    __all__ = ('celery_app',)
  • In mysite/mysite/settings.py, add the following (the CELERY_ prefix matches the namespace='CELERY' argument used above):
  • CELERY_BROKER_URL = 'redis://localhost:6379'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379'
    CELERY_ACCEPT_CONTENT = ['application/json']
  • Now we need to add a task.  To do so, we'll create a Django app, then create a new file in the app called tasks.py.  The app.autodiscover_tasks() call searches for a tasks.py file in all the apps we create, so you can create multiple apps that hold different tasks as long as each stores them in a tasks.py file.
  • (celery-demo) C:\Users\Owner\Desktop\code\celery-demo\mysite>py manage.py startapp main
  • Notice all tasks must use the @shared_task decorator. Create tasks.py in the main app and add the following task:
    • #mysite/main/tasks.py
      from __future__ import absolute_import
      from celery import shared_task

      @shared_task
      def test(param):
          return 'The tasks executed with the following parameter: "%s"' % param
  • Create urls.py in the main app:
    • #mysite/main/urls.py
      from django.contrib import admin
      from django.urls import path, include
      from . import views

      urlpatterns = [
          path('', views.home),
      ]
  • Add a view to connect the newly added task to a slug.  Notice how we add delay when we call the task to execute the function in the background.
    • #mysite/main/views.py
      from django.shortcuts import render
      from django.http import HttpResponse
      from .tasks import test

      # Create your views here.
      def home(request):
          test.delay(15)  # queue the task in the background
          return HttpResponse("Hey there!")
  • Time to install Redis:
    • On Windows, since Redis does not have an official release, download the latest release from a community-maintained Windows port.
      • Extract the zipped file to the mysite project folder. 
      • In command prompt, run redis-server.exe
      • In a new command prompt, go to the project directory and run redis-cli.exe
      • Type in "ping" and you should receive a "PONG" response
      • Close the command prompt
    • On Mac, feel free to use Homebrew and enter the following commands in the mysite directory
      • brew install redis
      • Run redis-server in terminal
      • In a new terminal, go to the project directory and run redis-cli
      • Type in "ping" and you should receive a "PONG" response
  • Keep your Redis server running in one terminal/command prompt; in the other, stop running redis-cli and enter the following command to install Flower
    • pip install flower
  • Now, make sure to have 4 terminals/command prompts for the following commands:
    • 1) To run the redis server 
    • 2) To run Django: py manage.py runserver (Windows) or python3 manage.py runserver (Mac)
    • 3) To run celery: celery -A mysite worker -l info --pool=solo
    • 4) To run flower: flower -A mysite
  • After running flower, you will see an address to view the dashboard, in this case, at port 5555
    • (celery-demo) C:\Users\Owner\Desktop\code\celery-demo\mysite>flower -A mysite
      [I 200914 08:03:54 command:140] Visit me at http://localhost:5555
      [I 200914 08:03:54 command:145] Broker: redis://localhost:6379//
    • (Screenshot: the Flower dashboard after startup)
  • Visit your local development server (by default http://localhost:8000/) to run your task and then view the Flower dashboard again
    • (Screenshot: the Flower dashboard showing the completed task)
  • If the dashboard shows that the task succeeded, then you should be able to view the executed task in the terminal/command prompt that was running celery:
    • [2020-09-21 12:12:23,328: INFO/MainProcess] Connected to redis://localhost:6379//
      [2020-09-21 12:12:23,338: INFO/MainProcess] mingle: searching for neighbors
      [2020-09-21 12:12:24,367: INFO/MainProcess] mingle: all alone
      [2020-09-21 12:12:24,376: WARNING/MainProcess] c:\users\owner\desktop\code\celery-demo\lib\site-packages\celery\fixups\ UserWarning: Using settings.DEBUG leads to a memory
                  leak, never use this setting in production environments!
      [2020-09-21 12:12:24,377: INFO/MainProcess] celery@DESKTOP-9ERS2HA ready.
      [2020-09-21 12:12:37,121: INFO/MainProcess] Events of group {task} enabled by remote.
      [2020-09-21 12:12:40,825: INFO/MainProcess] Received task: main.tasks.test[4b88a8e4-9efa-4d7c-82ff-bfffa5969c4c]
      [2020-09-21 12:12:40,829: INFO/MainProcess] Task main.tasks.test[4b88a8e4-9efa-4d7c-82ff-bfffa5969c4c] succeeded in 0.0s: 'The tasks executed with the following parameter:15'


Conclusion: Benefits of Asynchronous Tasks

While this was a rather simple tutorial on scheduling background/asynchronous tasks, there are plenty of opportunities moving forward.  Think about Twitter or Instagram and how users rarely need to reload a given page.  Instead, these apps "feel alive" as likes, retweets, and shares update nearly in real time.  Asynchronous tasks allow applications to provide smoother user experiences since users can constantly engage with updated content.



