Django + Celery Worker + Redis

I am developing a Django project that is currently hosted on Vercel as my API server. I want to implement Celery workers to offload long-running tasks (prompting GenAI models and processing their outputs) and then send emails. I am familiar with running Celery locally: I have my Django server running and, at the same time, my Celery worker (connected to Redis on Redis Cloud as the broker) running via this command in the terminal:

celery -A tasks worker --loglevel=info 

My question is: how do you run both the Django server and the Celery worker concurrently on Vercel?

Hi, @lukejamestyler! Welcome to the Vercel Community. :smile:

Vercel is designed for serverless deployments, which means:

  1. It doesn’t support long-running processes like Celery workers.
  2. Each function (including your Django views) runs in isolation and terminates after execution.
  3. There’s no persistent server environment where you can keep a Celery worker running.

Instead, you could:

  • Host Celery workers separately on platforms like Heroku or AWS EC2, while keeping your Django API on Vercel.
  • Use serverless-friendly task queues like AWS SQS with Lambda or Google Cloud Tasks instead of Celery.
  • Implement a webhook-based system where Vercel functions trigger tasks on external services.
  • Utilize Vercel’s Cron Jobs for scheduled tasks, though this has limited functionality compared to Celery.
  • If Celery is essential, consider migrating your entire project to a platform that supports both Django and Celery, such as Heroku or AWS Elastic Beanstalk.

Let us know how you get on!

