
By the end of this guide, your Celery jobs will appear in the Sluice dashboard — with real-time state tracking, search, and management actions.

Prerequisites

  • Python 3.11 or later
  • Celery 5.3 or later with a Redis broker
  • A Sluice account (free, no credit card required)

1. Create a connection

Sign in to the Sluice dashboard and go to Connections. Click Add Connection, pick “Python SDK”, and name your connection (e.g., “Production Celery”). You’ll receive:
  • An API key — starts with sk_
  • A Connection ID — a UUID that identifies this connection
Copy both values.

2. Install the SDK

pip install sluice-celery

3. Initialize Sluice

Add an import and an init() call to your Celery configuration — wherever you define your Celery app:
import sluice

sluice.init(
    api_key="sk_live_...",
    connection_id="550e8400-e29b-41d4-a716-446655440000",
)

# Your existing Celery config below
broker_url = "redis://localhost:6379/0"
Use environment variables in production. Set SLUICE_API_KEY and SLUICE_CONNECTION_ID as env vars, then call sluice.init() with no arguments. This keeps secrets out of your codebase.
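If you want to fail fast when those variables are missing, a small guard like the following can run before sluice.init(). This is a sketch; the helper name is our own and not part of the SDK:

```python
import os

def sluice_config_from_env():
    """Return Sluice credentials from the environment, or raise if any are missing.

    Uses the variable names the SDK reads: SLUICE_API_KEY and SLUICE_CONNECTION_ID.
    """
    missing = [
        name
        for name in ("SLUICE_API_KEY", "SLUICE_CONNECTION_ID")
        if not os.environ.get(name)
    ]
    if missing:
        raise RuntimeError(f"Missing required env vars: {', '.join(missing)}")
    return {
        "api_key": os.environ["SLUICE_API_KEY"],
        "connection_id": os.environ["SLUICE_CONNECTION_ID"],
    }
```

Checking for the variables yourself gives a clear startup error instead of a silent misconfiguration at task time.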

4. Restart your worker

celery -A your_app worker -l info
You should see Sluice confirm the connection in the worker logs (the first line reflects the framework the SDK detected, Django in this example):
[sluice] Django project detected.
[sluice] SDK initialized successfully.
[sluice] Celery worker 'celery@your-hostname' configured for monitoring.

5. See your data

Open the Sluice dashboard. Send a task from your application and watch it appear in real time. That’s it — you’re monitoring Celery.

What the SDK does automatically

When sluice.init() runs, the SDK:
  1. Auto-configures Celery events — sets worker_send_task_events=True, task_send_sent_event=True, and task_track_started=True. These are all False by default in Celery, and without them, most monitoring data is invisible.
  2. Installs a Celery Bootstep — hooks into the worker lifecycle to capture task events as they happen.
  3. Normalizes and forwards events — batches events and sends them to the Sluice API over HTTPS.
  4. Never crashes your worker — all runtime errors are caught and logged. The SDK operates under a strict “production cannot fail” philosophy.
The SDK auto-detects whether you’re running Django or standalone Celery and configures itself accordingly. No framework-specific setup is needed.

Next steps