# Background Tasks
Run long-running tasks in the background to avoid lag in the request-response cycle. Example tasks include:

- uploading a user photo
- sending contact form details to an admin recipient
## Relationship to async
Alternatively (or in tandem), the codebase can be modified to make the operation asynchronous. See, for instance, the implementation of this setup for long-running tasks in Running Tasks Concurrently in Django Asynchronous Views. I might explore this setup once I'm able to grasp the async nuances.
## Local Development

### Connecting the Tooling
At this point, I won't delay the request-response cycle. I just want to demonstrate the interaction between:

- a task defined in `profiles/tasks.py`
- a separate worker process
- a message broker
For this purpose, I need to make some adjustments to the default boilerplate:

- `huey` as the running worker process via `run_huey`
- another sqlite database, e.g. `huey.db`, as the huey message broker
See the huey `task` decorator.
The function below handles the storage of an image on behalf of a caller function. Because of the `@task` decorator, if `immediate: False` is set, the call gets sent to the message broker instead and control returns immediately to the caller. This places the decorated task in the job queue to be resolved by `run_huey`.
profiles/tasks.py:
This means that the `run_huey` consumer must be running to handle the queued task.
### Start worker
There are several background task services, the most prominent of which is likely `celery`. Here I'll use `huey` with its default settings, slightly modified:
config/bases/local.py:
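A sketch of the relevant setting, assuming the `HUEY` dict that `huey.contrib.djhuey` reads (key names follow the boilerplate; the `filename` value is illustrative):

```python
# config/bases/local.py (sketch)
HUEY = {
    "huey_class": "huey.SqliteHuey",  # default is huey.RedisHuey; the boilerplate uses huey.MemoryHuey
    "filename": "huey.db",            # sqlite file acting as the message broker
    "immediate": False,               # queue tasks for the run_huey consumer instead of running inline
}
```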
- Instead of the default `huey_class` of `huey.RedisHuey` (which the boilerplate changes to `huey.MemoryHuey`), `huey.SqliteHuey` can serve as a simple message broker to demonstrate the job being consumed.

Without this modification, attempting to `run_huey` results in _huey.exceptions.ConfigurationError: Consumer cannot be run with Huey instances where immediate is enabled._
#### huey.SqliteHuey vs. redis-server
Instead of using sqlite, `redis-server` can run in the background as the broker. See the macOS installation instructions; note this is a global installation on the OS.
Note that `python manage.py run_huey` creates the following files in the `src/` directory:

```
huey.db
huey.db-shm
huey.db-wal
```
The `huey.db` file, acting as the message broker, gets populated for each task queued.
### Test service
Change a photo from the settings dashboard and this will result in a new task being created in huey. In the huey console started above, we'll notice two additional log lines.
Inspecting `huey.db`, particularly the `kv` table, note the new entry added:

```sh
sqlite3 huey.db ".headers on" "select * from kv"
# queue|key|value
# db.sqlite|45a09254-bd89-4ad5-9bf3-efa7cc964cdf|��
```
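The same inspection can be done with Python's stdlib `sqlite3` module. Here an in-memory stand-in with huey's `kv` layout (queue | key | value) substitutes for the real `huey.db`:

```python
import sqlite3

# In-memory stand-in for huey.db, using the kv layout shown above.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE kv (queue TEXT, key TEXT, value BLOB)")
con.execute(
    "INSERT INTO kv VALUES (?, ?, ?)",
    ("db.sqlite", "45a09254-bd89-4ad5-9bf3-efa7cc964cdf", b"\x80\x04"),
)
for queue, key, _value in con.execute("SELECT * FROM kv"):
    print(queue, key)  # the value column holds the serialized task payload
```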
## Local/Staging Development
Prefatorily, it takes 6-8 seconds before Cloudflare is able to store a new image associated with a user profile. This means the user would need to wait for the process to complete before the view can return a response, which makes Cloudflare Images uploads a suitable candidate for a background task.
I'll reproduce the infrastructure described under Local Development, using Cloudflare Images instead of local file storage to save uploaded image files. This implies setting some env variables and getting redis up and running:

```sh
ENV_NAME=test # (1)
REDIS_URL=redis://redis:6379/0 # (2)
CF_ACCT_ID=aaa # (3)
CF_IMG_TOKEN=bbb
CF_IMG_HASH=ccc
```

1. Enables the app's use of Cloudflare instead of the local default. See the sample implementation in profiles/utils.py used in profiles/models.py.
2. Implies redis will be running in the background.
3. Assumes prior setup of Cloudflare Images.
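How `ENV_NAME` might gate the storage backend (a hypothetical helper; the project's actual logic lives in profiles/utils.py):

```python
import os

def use_cloudflare() -> bool:
    """Hypothetical toggle: use Cloudflare Images when not in plain local dev."""
    return os.environ.get("ENV_NAME", "") == "test"

os.environ["ENV_NAME"] = "test"
print(use_cloudflare())  # True
```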
Run 3 services simultaneously: `redis-server`, the huey consumer via `python manage.py run_huey`, and the Django dev server.
Like the scenario above, try changing a photo from the settings dashboard. This should result in a new task being created in huey.
After the request is sent, a response can immediately be returned and the task of uploading an image to Cloudflare Images, a long-running task, gets handled by a worker process in the background.
## Docker/Staging Development

### compose
Use `compose.yaml` to build a local test environment that can make use of Cloudflare's API. It wires together the services described above so that they can run as interconnected containers:
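A minimal `compose.yaml` along these lines might look like the following sketch (service and image names are assumptions; the `redis` service name matches the `redis://redis:6379/0` URL above):

```yaml
services:
  redis:
    image: redis:7-alpine
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    env_file: .env
    depends_on:
      - redis
  worker:
    build: .
    command: python manage.py run_huey
    env_file: .env
    depends_on:
      - redis
```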
See compose command in detail