# Celery Scheduling Setup Guide

This guide explains how to set up and use the Celery scheduling system with your Lin application.

## Overview

The updated `start_app.py` now automatically starts both the Flask application and the Celery components (worker and beat scheduler) when you run the application, ensuring that your scheduled tasks execute properly.
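As a rough sketch of that startup flow (illustrative only; the actual `start_app.py` may differ, and the `celery_config` module name is taken from the commands used later in this guide):

```python
# Illustrative sketch of the start_app.py startup flow, NOT the actual script.
import subprocess
import sys

def celery_worker_cmd(app_module: str = "celery_config") -> list[str]:
    """Command line for a background Celery worker."""
    return [sys.executable, "-m", "celery", "-A", app_module, "worker", "--loglevel=info"]

def celery_beat_cmd(app_module: str = "celery_config") -> list[str]:
    """Command line for the Celery beat scheduler."""
    return [sys.executable, "-m", "celery", "-A", app_module, "beat", "--loglevel=info"]

def start_background_components(app_module: str = "celery_config") -> list[subprocess.Popen]:
    """Launch worker and beat in the background; the Flask app is started afterwards."""
    return [subprocess.Popen(celery_worker_cmd(app_module)),
            subprocess.Popen(celery_beat_cmd(app_module))]
```

The key point is that all three processes (worker, beat, Flask) run from a single entry point, so you don't have to start them in separate terminals.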
## Prerequisites

### 1. Redis Server

Celery requires Redis as a message broker. Make sure Redis is installed and running:

**Windows:**

```shell
# Install Redis (if not installed)
# Download from https://github.com/microsoftarchive/redis/releases

# Start the Redis server
redis-server
```

**Linux/macOS:**

```shell
# Install Redis
sudo apt-get install redis-server   # Ubuntu/Debian
brew install redis                  # macOS

# Start Redis (Linux with systemd)
sudo systemctl start redis
sudo systemctl enable redis
```
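Before starting the app, you can verify that Redis is reachable. A minimal check using only the Python standard library (a sketch; it speaks the raw Redis protocol directly rather than using a client library):

```python
# Minimal Redis reachability check: send an inline PING and expect +PONG.
import socket

def redis_available(host: str = "localhost", port: int = 6379, timeout: float = 2.0) -> bool:
    """Return True if a Redis server answers PING at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(b"PING\r\n")
            return sock.recv(16).startswith(b"+PONG")
    except OSError:
        return False
```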
### 2. Hugging Face Spaces / Production Deployment

For Hugging Face Spaces or other production deployments where you can't run Redis directly:

**Option A: Use Redis Cloud (Recommended)**

- Create a free Redis Cloud account at https://redislabs.com/try-free/
- Create a Redis database (a free tier is available)
- Update your `.env` file:

```shell
CELERY_BROKER_URL="redis://your-redis-host:port/0"
CELERY_RESULT_BACKEND="redis://your-redis-host:port/0"
```
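A quick sanity check for the URL format above, using only the standard library (illustrative only):

```python
# Quick format check for a Redis broker URL.
from urllib.parse import urlparse

def looks_like_redis_url(url: str) -> bool:
    """True if the URL uses the redis:// or rediss:// scheme and names a host."""
    parsed = urlparse(url)
    return parsed.scheme in ("redis", "rediss") and bool(parsed.hostname)
```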
**Option B: Use Docker Compose**

```yaml
# docker-compose.yml
version: '3.8'
services:
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --appendonly yes
volumes:
  redis_data:
```

**Option C: Skip Celery (Basic Mode)**

If Redis is not available, the Flask app will start without Celery functionality. Schedules will be saved but won't execute automatically.
### 3. Python Dependencies

Install the required packages:

```shell
pip install -r backend/requirements.txt
```
## Starting the Application

### Using start_app.py (Recommended)

```shell
python start_app.py
```

This will:

- Check the Redis connection
- Start the Celery worker in the background
- Start the Celery beat scheduler in the background
- Start the Flask application

### Using Backend Scripts (Alternative)

```shell
# Start both worker and beat
cd backend
python start_celery.py all

# Or start them individually
python start_celery.py worker   # Start the Celery worker
python start_celery.py beat     # Start the Celery beat scheduler
```
## Configuration

### Environment Variables

Make sure these are set in your `.env` file:

```shell
# Supabase configuration
SUPABASE_URL="your_supabase_url"
SUPABASE_KEY="your_supabase_key"

# Redis configuration (if not using the defaults)
CELERY_BROKER_URL="redis://localhost:6379/0"
CELERY_RESULT_BACKEND="redis://localhost:6379/0"

# Scheduler configuration
SCHEDULER_ENABLED=True
```
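A sketch of how these variables might be read in Python, falling back to the same local defaults (the app's actual config loading may differ):

```python
# Read the scheduling-related settings with local-development defaults.
import os

CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL", "redis://localhost:6379/0")
CELERY_RESULT_BACKEND = os.getenv("CELERY_RESULT_BACKEND", "redis://localhost:6379/0")
# Environment variables are strings, so the boolean flag needs explicit parsing.
SCHEDULER_ENABLED = os.getenv("SCHEDULER_ENABLED", "True").strip().lower() in ("1", "true", "yes")
```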
### Celery Configuration

The unified configuration is in `backend/celery_config.py`:

```python
from celery.schedules import crontab

celery_app.conf.beat_schedule = {
    'load-schedules': {
        'task': 'backend.celery_tasks.schedule_loader.load_schedules_task',
        'schedule': crontab(minute='*/5'),  # Every 5 minutes
    },
}
```
## How Scheduling Works

### 1. Schedule Loading

- **Immediate updates:** When you create or delete a schedule via the API, Celery Beat is updated immediately
- **Periodic updates:** As a backup, Celery Beat also runs `load_schedules_task` every 5 minutes, which fetches the schedules from the Supabase database and creates an individual periodic task for each one

### 2. Task Execution

- **Content generation:** Runs 5 minutes before the scheduled time
- **Post publishing:** Runs at the scheduled time
- Tasks are routed to the appropriate queues (`content`, `publish`)

### 3. Database Integration

- Uses Supabase for schedule storage
- Automatically creates tasks based on schedule data
- Handles social network authentication
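To make the timing rules above concrete, here is a hedged sketch of how one schedule record could be turned into two beat-style entries. The field names (`schedule_time`, `days`) follow the API example in this guide; the real `load_schedules_task` may work differently.

```python
# Sketch: derive crontab-style fields for the generation and publish runs
# of a single schedule record. Illustrative only.
DAY_ABBREV = {
    "Monday": "mon", "Tuesday": "tue", "Wednesday": "wed", "Thursday": "thu",
    "Friday": "fri", "Saturday": "sat", "Sunday": "sun",
}

def beat_entries(schedule: dict) -> dict:
    """Return crontab-style fields for the content-generation and publish runs."""
    hour, minute = (int(part) for part in schedule["schedule_time"].split(":"))
    days = ",".join(DAY_ABBREV[day] for day in schedule["days"])
    # Content generation runs 5 minutes before the publish time,
    # wrapping across the hour (and midnight) boundary if needed.
    gen_minute = (minute - 5) % 60
    gen_hour = hour if minute >= 5 else (hour - 1) % 24
    return {
        "generate": {"minute": gen_minute, "hour": gen_hour, "day_of_week": days},
        "publish": {"minute": minute, "hour": hour, "day_of_week": days},
    }
```

For example, a `09:00` schedule on Monday and Friday yields a generation run at 08:55 and a publish run at 09:00 on those days.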
## Monitoring and Debugging

### Checking Celery Status

```shell
# Check worker status
celery -A celery_config inspect stats

# Check scheduled tasks
celery -A celery_config inspect scheduled

# Check active tasks
celery -A celery_config inspect active
```

### Viewing Logs

- **Flask application:** Check the console output
- **Celery worker:** Look for the worker process logs
- **Celery beat:** Look for the beat process logs
## Common Issues

### 1. Redis Connection Failed

```
Error: Error 111 connecting to localhost:6379. Connection refused
```

Solutions:

1. Start a Redis server locally (development):
   - Windows: `redis-server`
   - Linux/Mac: `sudo systemctl start redis`
2. Use Redis Cloud (production/Hugging Face):
   - Create a free Redis Cloud account
   - Update `.env` with the Redis Cloud URL
   - Set `CELERY_BROKER_URL` and `CELERY_RESULT_BACKEND`
3. Use Docker Compose (see Option B under Prerequisites for a `docker-compose.yml` example)
4. Skip Celery (basic mode):
   - The app starts without scheduling functionality
   - Schedules are saved but won't execute automatically
### 2. Tasks Not Executing

```shell
# Check whether the Celery worker is running
celery -A celery_config inspect ping

# Check whether the beat scheduler is running
ps aux | grep celery
```

### 3. Schedule Not Loading

- Check the Supabase database connection
- Verify the schedule data in the database
- Check task registration in Celery
## Testing the Scheduling System

### Manual Testing

```python
# Test the schedule-loading task directly
from backend.celery_tasks.schedule_loader import load_schedules_task

result = load_schedules_task()
print(result)
```
### API Testing (Recommended)

Create a schedule via the API:

```shell
curl -X POST http://localhost:5000/api/schedules/ \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -d '{"social_network": "1", "schedule_time": "09:00", "days": ["Monday", "Wednesday", "Friday"]}'
```

**Check the response:** You should see a `celery_update_task_id` field indicating that the scheduler was updated immediately.

**Verify in Celery:** Check whether the individual tasks were created:

```shell
celery -A celery_config inspect scheduled
```
### Database Testing

1. Add a schedule directly in the Supabase database
2. Wait 5 minutes for the loader task to run (or trigger it via the API)
3. Check whether the individual tasks were created
4. Verify the task execution times
## Production Deployment

### Using Docker (Recommended for Hugging Face Spaces)

```shell
# Build the Docker image
docker build -t lin-app .

# Run the container
docker run -p 7860:7860 lin-app
```

For Hugging Face Spaces deployment:

1. Update your Dockerfile
2. Push to Hugging Face Spaces
3. The container will automatically start Redis and your app
### Using Docker Compose (Local Development)

```yaml
# docker-compose.yml
version: '3.8'
services:
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --appendonly yes
  app:
    build: .
    ports:
      - "7860:7860"
    depends_on:
      - redis
    environment:
      - REDIS_URL=redis://redis:6379/0
    volumes:
      - ./:/app
volumes:
  redis_data:
```

```shell
# Start all services
docker-compose up -d

# Check the logs
docker-compose logs -f
```
### Using Supervisor (Linux)

Create `/etc/supervisor/conf.d/lin.conf`:

```ini
[program:lin_worker]
command=python start_app.py
directory=/path/to/lin
autostart=true
autorestart=true
user=www-data
environment=PATH="/path/to/venv/bin"

[program:lin_beat]
command=python -m celery -A celery_config beat
directory=/path/to/lin
autostart=true
autorestart=true
user=www-data
environment=PATH="/path/to/venv/bin"
```
## Troubleshooting Checklist

- [ ] Redis server is running
- [ ] All Python dependencies are installed
- [ ] Environment variables are set correctly
- [ ] The Supabase database connection works
- [ ] The Celery worker is running
- [ ] The Celery beat scheduler is running
- [ ] Schedule data exists in the database
- [ ] Tasks are properly registered
- [ ] Task execution permissions are correct
## Support

If you encounter issues:

- Check this guide first
- Review the logs for error messages
- Verify that all prerequisites are met
- Test the components individually
- Check the Celery documentation

For additional help, refer to the Celery documentation at https://docs.celeryq.dev/