I have a project built with Django, DRF, SQLite, RabbitMQ, and Celery.
When I run the project locally (RabbitMQ, Celery, and Django started separately), everything works correctly. But when I run it in Docker, the Celery task does not find the object in the database. Attempts to find the newly created object from the celery container's shell are also unsuccessful (I only see old objects); a sketch of that check is included after the compose file below. If I enable the CELERY_TASK_ALWAYS_EAGER=True setting, everything works fine in Docker.
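Enabling eager mode makes the tasks run synchronously inside the Django process (and therefore against the same db.sqlite3 file), so a minimal sketch of what I turn on is just:

# settings.py — makes Celery execute tasks in-process instead of sending them to the worker
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True  # optional, re-raises task exceptions in the caller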
settings.py
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "db.sqlite3",
    }
}
docker-compose.yaml
version: "3.7"

services:
  rabbitmq:
    restart: always
    container_name: "rabbitmq"
    image: rabbitmq:3-management-alpine
    ports:
      - 5672:5672
      - 15672:15672

  app:
    restart: always
    container_name: "fbrq_api"
    build: .
    volumes:
      - .:/code
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"

  celery:
    restart: always
    container_name: "celery"
    build:
      context: .
    command: celery -A fbrq_api worker -l info
    env_file:
      - ./.env
    depends_on:
      - app
      - rabbitmq
    environment:
      - CELERY_BROKER_URL=amqp://guest:guest@rabbitmq:5672/
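For reference, this is roughly how I check from the worker container (a sketch; `mailings` is a placeholder for my actual app name):

# opened with: docker-compose exec celery python manage.py shell
from mailings.models import Mailing  # placeholder import path

Mailing.objects.count()       # only counts the old objects
Mailing.objects.latest("id")  # the newest row predates the current run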
celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "fbrq_api.settings")

app = Celery("fbrq_api", broker="amqp://guest:guest@rabbitmq:5672/")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()


@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f"Request: {self.request!r}")
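The Celery app is hooked into Django in the usual way (a sketch; this follows the standard pattern from the Celery docs):

# fbrq_api/__init__.py
from .celery import app as celery_app

__all__ = ("celery_app",)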
I understand that SQLite is not the best choice, but I don't think the problem is in the database itself; it seems to be in how Celery connects to the database.
UPDATE
An exception occurs here:
@shared_task(bind=True, acks_late=True)
def send_mailing(self, mailing_id):
    try:
        mailing = Mailing.objects.get(id=mailing_id)
        ...
    except Mailing.DoesNotExist:
        logger.warning(f"Mailing {mailing_id} doesn't exist")
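For context, the task is queued right after the object is created, roughly like this (a simplified sketch; the app name `mailings` and the helper function are placeholders, the real code lives in a DRF view):

from mailings.models import Mailing      # placeholder import path
from mailings.tasks import send_mailing  # placeholder import path

def create_and_schedule(**fields):
    # the row is saved to db.sqlite3 by the web (app) container
    mailing = Mailing.objects.create(**fields)
    # the worker then picks up the task but logs "Mailing <id> doesn't exist"
    send_mailing.delay(mailing.id)
    return mailing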