How can I manage the growing memory consumption of redis-server in a Docker environment?


I use redis-server as part of the Docker stack of a Django project that runs scheduled tasks with Celery Beat. While monitoring processes with the htop command, I noticed that the memory used by redis-server grows gradually and continuously over time. Are there recommended practices or settings I should put in place to manage the memory used by redis-server, especially in an environment that uses Celery Beat?

Docker version 24.0.7
Docker Compose version v2.21.0

local.yml

  redis:
    image: redis:6
    container_name: scielo_core_local_redis
    ports:
      - "6399:6379"


  celeryworker:
    <<: *django
    image: scielo_core_local_celeryworker
    container_name: scielo_core_local_celeryworker
    depends_on:
      - redis
      - postgres
      - mailhog
    ports: []
    command: /start-celeryworker

  celerybeat:
    <<: *django
    image: scielo_core_local_celerybeat
    container_name: scielo_core_local_celerybeat
    depends_on:
      - redis
      - postgres
      - mailhog
    ports: []
    command: /start-celerybeat

base.py

# Celery
# ------------------------------------------------------------------------------
if USE_TZ:
    # http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-timezone
    CELERY_TIMEZONE = TIME_ZONE
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ["json"]
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_TIME_LIMIT = 5 * 60
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-soft-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_SOFT_TIME_LIMIT = 36000
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#beat-scheduler
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html
DJANGO_CELERY_BEAT_TZ_AWARE = False

# Celery Results
# ------------------------------------------------------------------------------
# https://django-celery-results.readthedocs.io/en/latest/getting_started.html
CELERY_RESULT_BACKEND = "django-db"
CELERY_CACHE_BACKEND = "django-cache"
CELERY_RESULT_EXTENDED = True
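
For context on the settings above: with CELERY_BROKER_URL pointing at Redis and results going to django-db, what accumulates in Redis is mostly broker state (queued and unacknowledged task messages) rather than results. A minimal sketch, in the same CELERY_*-namespaced style, of settings that bound that state; the names come from the Celery configuration docs, and the values are illustrative, not taken from the project:

# Sketch only: illustrative values, not the project's actual configuration.

# Expire stored task results after one hour (the default is one day). With
# the django-db backend the periodic celery.backend_cleanup task deletes
# expired rows; with a Redis result backend this becomes a TTL on result keys.
CELERY_RESULT_EXPIRES = 3600

# Don't store results at all for tasks whose return value is never read.
CELERY_TASK_IGNORE_RESULT = True

# How long the Redis transport keeps a delivered-but-unacknowledged message
# before redelivering it; this bounds the broker's "unacked" bookkeeping.
CELERY_BROKER_TRANSPORT_OPTIONS = {"visibility_timeout": 3600}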

INFO memory

# Memory
used_memory:8538978880
used_memory_human:7.95G
used_memory_rss:6425821184
used_memory_rss_human:5.98G
used_memory_peak:8610299728
used_memory_peak_human:8.02G
used_memory_peak_perc:99.17%
used_memory_overhead:1300368
used_memory_startup:811864
used_memory_dataset:8537678512
used_memory_dataset_perc:99.99%
allocator_allocated:8539119712
allocator_active:8861048832
allocator_resident:8901853184
total_system_memory:16559783936
total_system_memory_human:15.42G
used_memory_lua:32768
used_memory_lua_human:32.00K
used_memory_scripts:296
used_memory_scripts_human:296B
number_of_cached_scripts:1
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.04
allocator_frag_bytes:321929120
allocator_rss_ratio:1.00
allocator_rss_bytes:40804352
rss_overhead_ratio:0.72
rss_overhead_bytes:-2476032000
mem_fragmentation_ratio:0.75
mem_fragmentation_bytes:-2113157632
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:487872
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
lazyfreed_objects:0
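
Note in this output that used_memory_dataset_perc is 99.99%, i.e. almost all of the ~8 GB is actual keys rather than fragmentation or overhead, and that maxmemory is 0 (unlimited) with the default noeviction policy. To see which keys are growing, here is a minimal redis-py sketch (assuming redis-py is installed and using the 6399 host port mapped in local.yml; on a dataset this size the scan will take a while):

import redis

# Connect via the host port mapped in local.yml (6399 -> 6379).
r = redis.Redis(host="localhost", port=6399)

# Length of the default Celery queue, which is a Redis list named "celery".
print("celery queue length:", r.llen("celery"))

# Walk the keyspace and report the largest keys by MEMORY USAGE (Redis >= 4).
sizes = []
for key in r.scan_iter(count=1000):
    sizes.append((r.memory_usage(key) or 0, key))

for size, key in sorted(sizes, reverse=True)[:20]:
    print(f"{size:>12} bytes  {key!r}")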

htop

(htop screenshot showing redis-server memory usage; image not reproduced)

1 Answer

You can try setting the maxmemory option in your Redis configuration:

Create a redis.conf:

echo "maxmemory 128M" > redis.conf

Mount it as a volume in docker compose:

  redis:
    image: redis:6
    container_name: scielo_core_local_redis
    ports:
      - "6399:6379"
    volumes:
      - ./redis.conf:/usr/local/etc/redis/redis.conf
    # The official image ignores a mounted config file unless it is
    # passed to redis-server explicitly:
    command: redis-server /usr/local/etc/redis/redis.conf
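
After recreating the container with docker compose up -d redis, you can check that the limit is active with redis-cli -p 6399 CONFIG GET maxmemory, which should report 134217728 (i.e. 128M in bytes).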

Reference: https://redis.io/docs/get-started/faq/#how-can-i-reduce-redis-overall-memory-usage

If this works, you can tune maxmemory to a value that fits your situation. Another option is to limit the number of connections; by default, Redis accepts up to 10K connections. Read more here: https://redis.io/docs/reference/clients/
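
One caveat before relying on maxmemory alone: the INFO output above shows maxmemory_policy:noeviction, the default. Under noeviction Redis does not free anything when the cap is reached; it starts rejecting writes with OOM errors, which makes Celery task publishing fail. Whether to combine maxmemory with an eviction policy such as allkeys-lru depends on whether you can tolerate Redis silently dropping broker data. As a sketch, the same two settings can also be applied at runtime with redis-py instead of a config file (illustrative values; note that CONFIG SET does not survive a container restart, so the redis.conf approach above is the durable one):

import redis

r = redis.Redis(host="localhost", port=6399)

# Cap Redis memory (illustrative value; tune it to your workload).
r.config_set("maxmemory", "1gb")

# "noeviction" (the default) makes writes fail with an OOM error at the cap;
# "allkeys-lru" would evict least-recently-used keys instead, but for a
# Celery broker that can mean silently dropping queued task messages.
r.config_set("maxmemory-policy", "noeviction")

# Verify the applied values.
print(r.config_get("maxmemory"), r.config_get("maxmemory-policy"))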
