Celery worker doesn't perform any writes when running in Docker

Problem description

I'm not sure how best to describe the problem, but basically I have Celery tasks that do some work and write objects to the database.

from app.celery import celery_app
from some.models import Thing
import requests

@celery_app.task
def my_task_1():
    response = requests.get("https://example.com/response1.json")
    things = []

    for data in response.json()['records']:
        thing, _ = Thing.objects.get_or_create(
            name=data['name'],
            data=data['data'],
        )
        things.append(thing)
    # ... snip ...

@celery_app.task
def my_task_2():
    response = requests.get("https://example.com/response2.json")
    things = []

    for data in response.json()['records']:
        thing, _ = Thing.objects.get_or_create(
            name=data['name'],
            data=data['data'],
        )
        things.append(thing)
    # ... snip ...

The tasks are scheduled with django-celery-beat, using only a single worker.
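For completeness, the periodic entries are created roughly along these lines (they could equally be configured through django-admin; the dotted task path some.tasks and the 10-minute interval are assumptions, not the exact values from my project):

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# A shared 10-minute interval (the interval itself is only illustrative).
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.MINUTES,
)

# One periodic entry per task; 'some.tasks.my_task_1' is an assumed dotted path.
PeriodicTask.objects.get_or_create(
    name='fetch response1',
    task='some.tasks.my_task_1',
    interval=schedule,
)
PeriodicTask.objects.get_or_create(
    name='fetch response2',
    task='some.tasks.my_task_2',
    interval=schedule,
)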
The tasks run perfectly on my development machine (or so I think). However, once I deploy the project with Docker, I run into some weird issues:

1. I can see that Thing objects are being created, but they can't be accessed? (A direct-database check is sketched after this list.)
2. The newly created Thing objects are not visible in django-admin.
3. Once I restart the worker, the objects become visible.
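To make symptom 1 concrete, one way to see whether the rows actually reach Postgres, bypassing Django entirely, is to query the db container directly (the table name some_thing is an assumption based on Django's default naming for the Thing model in the some app; use docker-compose instead of docker compose on the older CLI):

# Count rows as Postgres itself sees them, independent of any Django-level caching.
docker compose exec db psql -U postgres -d postgres -c "SELECT count(*) FROM some_thing;"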

My docker-compose file looks like this:

version: '3'

services:
  db:
    image: postgres:latest
    restart: always
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: postgres
    volumes:
      - db_data:/var/lib/postgresql/data
  redis:
    image: redis:latest
    restart: always

  web:
    build: .
    restart: always
    command: /webserver-entrypoint.sh
    environment:
      POSTGRES_HOST: db
      CELERY_BROKER_URL: 'redis://redis:6379/0'
      CELERY_RESULT_BACKEND: 'redis://redis:6379/0'
    ports:
      - "8000:8000"
    depends_on:
      - db

  worker:
    build: .
    restart: always
    command: celery -A asmcore.server worker -B -P threads -l info
    depends_on:
      - db
    environment:
      CELERY_BROKER_URL: 'redis://redis:6379/0'
      CELERY_RESULT_BACKEND: 'redis://redis:6379/0'
      POSTGRES_HOST: db

volumes:
  db_data:

Environment:

python = "^3.11"
django = "^5.0.3"
elasticsearch = "^8.13.0"
celery = {extras = ["redis"], version = "^5.3.6"}
djangorestframework = "^3.15.1"
django-unfold = "^0.21.1"
django-celery-beat = "^2.6.0"
django-import-export = "^3.3.7"
django-guardian = "^2.4.0"
django-simple-history = "^3.5.0"
requests = "^2.31.0"
django-debug-toolbar = "^4.3.0"
django-taggit = "^5.0.1"
markdown = "^3.6"
django-filter = "^24.2"
django-cachalot = "^2.6.2"
gunicorn = "^21.2.0"
psycopg = "^3.1.18"

What I've tried:

1. Wrapping my tasks in transaction.atomic (a sketch of what that looked like follows this list).
2. Running an init process in the Docker image, such as tini or s6-overlay.
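The transaction.atomic attempt looked roughly like this (only my_task_1 is shown; my_task_2 was wrapped the same way):

from django.db import transaction

from app.celery import celery_app
from some.models import Thing
import requests

@celery_app.task
def my_task_1():
    response = requests.get("https://example.com/response1.json")
    things = []

    # All writes commit together when the atomic block exits.
    with transaction.atomic():
        for data in response.json()['records']:
            thing, _ = Thing.objects.get_or_create(
                name=data['name'],
                data=data['data'],
            )
            things.append(thing)
        # ... snip ...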

1 Answer

It looks like you need to define the port for the redis container in your docker-compose file so that the other containers can communicate with it.

  redis:
    image: redis:latest
    restart: always
    ports:
      - '127.0.0.1:6379:6379'

There may be more going on, but this is one thing I would try.
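As a quick check on that theory, you can verify from inside the worker container that the broker is actually reachable; this relies only on the redis-py client pulled in by celery[redis] and the CELERY_BROKER_URL already set in the compose file (use docker-compose instead of docker compose on the older CLI):

# Should print True if the worker container can reach the redis service over the compose network.
docker compose exec worker python -c "import os, redis; print(redis.Redis.from_url(os.environ['CELERY_BROKER_URL']).ping())"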
