I use Celery to process background jobs that call a blocking API for a Django web app. Since the API can only process one job at a time, I also use a lock (implemented with Redis) to ensure that only one job runs at a time. My jobs have different priorities, and I wonder whether I need to implement the priority ordering myself, or whether Celery or Redis already provides a priority queue, which might also be more robust against race conditions.
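For reference, this is the built-in mechanism I have found so far: Celery can emulate per-message priorities on a Redis broker via transport options (the settings below follow the broker_transport_options documented for the Redis transport). I am not sure whether this emulation alone is robust enough for my case:

from celery import Celery

app = Celery("myapp", broker="redis://localhost:6379/0")

# Redis has no native per-message priorities, so Celery emulates them
# by splitting each queue into several sub-queues and consuming them
# in order. With this transport, 0 is (as far as I understand) the
# highest priority.
app.conf.broker_transport_options = {
    "priority_steps": list(range(10)),
    "sep": ":",
    "queue_order_strategy": "priority",
}

# Jobs would then be enqueued with apply_async() instead of plain .delay():
# my_task.apply_async(kwargs={...}, priority=0)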
Ideally, the running job would also be able to check whether something with a higher priority is waiting in the queue and, if so, cancel the current operation and put itself back in the queue (see the sketch after my current code below for roughly what I mean).
My current code (without priorities or cancellation) looks like this:
from celery import shared_task
from django.core.cache import cache

@shared_task
def my_task(data1, data2):
    # cache.lock requires the django-redis cache backend
    with cache.lock("my_task"):
        use_api(data1)
        # here the task could cancel itself
        use_api(data2)

def some_view(request):
    my_task.delay(data1=request.GET.get("data1"),
                  data2=request.GET.get("data2"))
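For the cancellation part, this is roughly what I imagine. Everything here (the sorted set, the PENDING_KEY name, the enqueue_my_task helper, the priority and job_id arguments) is my own invention, not an existing Celery or Redis feature, which is exactly why I am asking whether something ready-made exists:

import uuid
import redis
from celery import shared_task
from django.core.cache import cache

r = redis.Redis()  # connection details assumed
PENDING_KEY = "my_task:pending"  # hypothetical sorted set of waiting jobs

def enqueue_my_task(data1, data2, priority=5):
    # Register the job in the sorted set (score = priority) before
    # handing it to Celery, so the running task can see what is waiting.
    job_id = str(uuid.uuid4())
    r.zadd(PENDING_KEY, {job_id: priority})
    my_task.apply_async(
        kwargs={"data1": data1, "data2": data2,
                "priority": priority, "job_id": job_id},
        priority=priority,
    )

@shared_task
def my_task(data1, data2, priority=5, job_id=None):
    with cache.lock("my_task"):
        if job_id:
            r.zrem(PENDING_KEY, job_id)  # no longer waiting, now running
        use_api(data1)
        # Peek at the most urgent waiting job (lower score = more urgent).
        waiting = r.zrange(PENDING_KEY, 0, 0, withscores=True)
        if waiting and waiting[0][1] < priority:
            # Something more urgent is queued: stop here, release the
            # lock, and re-enqueue this job at its original priority.
            enqueue_my_task(data1, data2, priority)
            return
        use_api(data2)

One obvious downside of this sketch is that the re-enqueued job would repeat use_api(data1) from scratch, and the check between the two API calls is still racy, so I would prefer an existing, battle-tested solution if one exists.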