
I use Celery to process background jobs for a Django web app. The jobs call a blocking API that can only process one job at a time, so I also use a lock (implemented with Redis) to ensure that only one job runs at a time. My jobs have different priorities, and I wonder whether I need to implement the sorting by priority myself, or whether Celery or Redis already provides a priority queue, which might also be more robust against race conditions.
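For reference, the ordering I'm after is an ordinary priority queue. A minimal in-process sketch with the standard library (the function names `enqueue`/`dequeue` are just illustrative; lower number = higher priority is an assumption) looks like this. A Redis sorted set gives the same semantics across processes: ZADD to enqueue with the priority as score, ZPOPMIN to atomically take the most urgent entry.

```python
import heapq
import itertools

_counter = itertools.count()  # tie-breaker so equal priorities stay FIFO

def enqueue(queue, priority, job):
    # Lower priority number = more urgent (assumption for this sketch).
    heapq.heappush(queue, (priority, next(_counter), job))

def dequeue(queue):
    # Pops the most urgent job; with Redis this would be ZPOPMIN.
    priority, _, job = heapq.heappop(queue)
    return priority, job

queue = []
enqueue(queue, 5, "low-priority job")
enqueue(queue, 1, "high-priority job")
enqueue(queue, 5, "another low-priority job")
```

The tie-breaking counter matters: without it, two jobs with equal priority would be compared by their payloads, which may not be orderable.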

Ideally, the currently running job could also check whether something with a higher priority is waiting in the queue and then decide whether to cancel its current operation and put itself back in the queue.
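To make the cancel-and-requeue idea concrete, here is a hypothetical sketch (all names are assumptions, and the in-memory heap stands in for the real queue; with Redis the peek could be a ZRANGE on the sorted set): between the two API calls, the job peeks at the best waiting priority and, if something more urgent is waiting, pushes its unfinished half back instead of completing.

```python
import heapq

def run_job(queue, my_priority, data1, data2, use_api):
    # queue holds (priority, payload) tuples; lower number = more urgent.
    use_api(data1)
    # Peek at the most urgent waiting job (queue[0] is the heap minimum).
    if queue and queue[0][0] < my_priority:
        # Something more urgent is waiting: requeue the remaining work.
        heapq.heappush(queue, (my_priority, (data2,)))
        return "requeued"
    use_api(data2)
    return "done"
```

One caveat with this shape: the requeued half must be safe to run later, so the two API calls need to be independent or the job has to record how far it got.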

The current code (without priorities and canceling tasks) looks like this:

from celery import shared_task
from django.core.cache import cache


@shared_task
def my_task(data1, data2):
    # cache.lock requires a Redis-backed cache (e.g. django-redis)
    with cache.lock("my_task"):
        use_api(data1)
        # here the task could cancel itself
        use_api(data2)


def some_view(request):
    my_task.delay(data1=request.GET.get("data1"),
                  data2=request.GET.get("data2"))
Comments:

  • Redis - Priority Queues (Apr 11, 2024 at 16:08)
  • @TedLyngmo Is there a Celery equivalent? I want a queue that uses only a single worker (or at least has only one active task at a time), orders the tasks by priority, and possibly allows the Celery task to query which tasks are waiting, so that it can cancel itself depending on estimates of the remaining runtime and the priorities of the waiting tasks. (Apr 11, 2024 at 17:16)
  • docs.celeryq.dev/en/stable/userguide/… (Apr 11, 2024 at 17:50)
