I am running redis-server as part of a Docker stack in a Django project that uses Celery Beat for scheduled tasks. While monitoring the processes with htop, I noticed that the memory used by redis-server grows gradually and continuously over time. Are there recommended practices or settings I should apply to manage the memory used by redis-server, especially in an environment with Celery Beat?
Docker version 24.0.7
Docker Compose version v2.21.0
local.yml
redis:
  image: redis:6
  container_name: scielo_core_local_redis
  ports:
    - "6399:6379"

celeryworker:
  <<: *django
  image: scielo_core_local_celeryworker
  container_name: scielo_core_local_celeryworker
  depends_on:
    - redis
    - postgres
    - mailhog
  ports: []
  command: /start-celeryworker

celerybeat:
  <<: *django
  image: scielo_core_local_celerybeat
  container_name: scielo_core_local_celerybeat
  depends_on:
    - redis
    - postgres
    - mailhog
  ports: []
  command: /start-celerybeat
base.py
# Celery
# ------------------------------------------------------------------------------
if USE_TZ:
    # http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-timezone
    CELERY_TIMEZONE = TIME_ZONE
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ["json"]
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_TIME_LIMIT = 5 * 60
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-soft-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_SOFT_TIME_LIMIT = 36000
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#beat-scheduler
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html
DJANGO_CELERY_BEAT_TZ_AWARE = False

# Celery Results
# ------------------------------------------------------------------------------
# https://django-celery-results.readthedocs.io/en/latest/getting_started.html
CELERY_RESULT_BACKEND = "django-db"
CELERY_CACHE_BACKEND = "django-cache"
CELERY_RESULT_EXTENDED = True
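One option I have been considering, but have not applied or verified, is to explicitly limit what Celery keeps in Redis. The setting names below assume the app loads its configuration with the usual CELERY_ namespace (as cookiecutter-django does), the values are placeholders, and since results in this project actually go to django-db, these would mostly affect broker and event traffic rather than results:

# Hypothetical additions to base.py (untested, placeholder values).
# Expire anything Celery stores as a result after one hour
# (only matters while Redis is acting as a result backend).
CELERY_RESULT_EXPIRES = 60 * 60
# Skip storing return values for tasks whose results are never read.
CELERY_TASK_IGNORE_RESULT = True
# Keep worker/task event traffic off unless a monitor (e.g. Flower) consumes it.
CELERY_WORKER_SEND_TASK_EVENTS = False
CELERY_TASK_SEND_SENT_EVENT = False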
INFO MEMORY
# Memory
used_memory:8538978880
used_memory_human:7.95G
used_memory_rss:6425821184
used_memory_rss_human:5.98G
used_memory_peak:8610299728
used_memory_peak_human:8.02G
used_memory_peak_perc:99.17%
used_memory_overhead:1300368
used_memory_startup:811864
used_memory_dataset:8537678512
used_memory_dataset_perc:99.99%
allocator_allocated:8539119712
allocator_active:8861048832
allocator_resident:8901853184
total_system_memory:16559783936
total_system_memory_human:15.42G
used_memory_lua:32768
used_memory_lua_human:32.00K
used_memory_scripts:296
used_memory_scripts_human:296B
number_of_cached_scripts:1
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.04
allocator_frag_bytes:321929120
allocator_rss_ratio:1.00
allocator_rss_bytes:40804352
rss_overhead_ratio:0.72
rss_overhead_bytes:-2476032000
mem_fragmentation_ratio:0.75
mem_fragmentation_bytes:-2113157632
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:487872
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
lazyfreed_objects:0
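As a next diagnostic step I plan to check which keys are actually growing with a small redis-py script. This is only a sketch: the host/port come from the compose file above, and the top-20 cutoff is arbitrary:

import redis

# Connect to the redis service published by the compose file (host port 6399).
r = redis.Redis(host="localhost", port=6399, db=0)

# Overall numbers, same fields as the INFO MEMORY output above.
mem = r.info("memory")
print("used_memory_human:", mem["used_memory_human"])

# Walk the keyspace and report the biggest keys (MEMORY USAGE needs Redis >= 4).
sizes = []
for key in r.scan_iter(count=1000):
    size = r.memory_usage(key) or 0
    sizes.append((size, key))

for size, key in sorted(sizes, reverse=True)[:20]:
    print(f"{size:>12}  {key.decode(errors='replace')}")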