Language: Python
CLI/Utils
Celery was created by Ask Solem and released in 2009. It was designed to provide a simple and reliable framework to run background tasks in Python applications, supporting distributed processing across multiple workers, queues, and brokers.
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation but supports scheduling as well.
pip install celery
conda install -c conda-forge celery
Celery allows you to define tasks as Python functions, which can be executed asynchronously or scheduled periodically. It supports multiple brokers such as RabbitMQ, Redis, and Amazon SQS, and provides tools for monitoring and managing task execution.
from celery import Celery
app = Celery('tasks', broker='redis://localhost:6379/0')
@app.task
def add(x, y):
    return x + y
Defines a Celery app with Redis as the broker and a simple `add` task that can be executed asynchronously.
result = add.delay(4, 6)
print(result.get(timeout=10))
Calls the `add` task asynchronously using `delay()` and retrieves the result with a timeout.
from celery.schedules import crontab
app.conf.beat_schedule = {
    'add-every-minute': {
        'task': 'tasks.add',
        'schedule': crontab(minute='*'),
        'args': (2, 3),
    },
}
Schedules the `add` task to run every minute using Celery Beat.
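Beat schedules also accept plain `timedelta` intervals and more specific `crontab` fields; a sketch of two common patterns (task names assume the `tasks` module above, and a separate `celery -A tasks beat` process must be running to dispatch them):

```python
from datetime import timedelta
from celery.schedules import crontab

app.conf.beat_schedule = {
    # run every 30 seconds
    'add-every-30s': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (2, 3),
    },
    # run at 07:30 every Monday
    'add-monday-morning': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (2, 3),
    },
}
```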
from celery import chain
result = chain(add.s(2, 3), add.s(4))().get()
Chains tasks together so that the output of one task is passed as the first argument to the next.
app = Celery('tasks', broker='redis://localhost:6379/0;amqp://guest@localhost//')
Configures Celery with multiple broker URLs (semicolon-separated) so that workers can fail over to an alternate broker if the primary becomes unavailable.
Use Redis or RabbitMQ as a reliable broker for production environments.
Keep tasks idempotent to allow retries safely.
Use separate queues for different priorities or types of tasks.
Monitor task execution using Flower or Celery events.
Avoid long-running tasks in synchronous workflows; delegate them to Celery workers.