Celery worker using a lot of memory
chrisswhitneyy
PROOP

2 years ago

Hello, I'm using the Django, Celery, Redis & Postgres template, and when I run it locally the worker uses around 900 MB of memory just idling. However, when I deploy it to Railway it uses around 3 GB of memory while idling. If anyone has any possible insight into this issue, it'd be much appreciated. I've played around with Celery's `--max-memory-per-child` setting, but that doesn't seem to make a difference.

It must be something in the deployment configuration or start command rather than the code base itself, because it doesn't use 3 GB of memory when running locally.

Thanks in advance.

5 Replies

chrisswhitneyy
PROOP

2 years ago

c9822e71-ad02-4e28-9f58-0fd9f2ed99b0


chrisswhitneyy
PROOP

2 years ago

Here's what I mean by memory usage locally vs. deployed

[screenshots: memory usage locally vs. deployed]


brody
EMPLOYEE

2 years ago

You want the `--concurrency` flag instead. I'd say setting it down to 2 initially is a safe bet, then do some testing; if that's not enough, bump it up.
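For context: Celery's prefork pool defaults its concurrency to the number of CPUs on the host, so on a large deployment machine it spawns many worker processes, each with its own copy of the Django app in memory, which inflates idle usage. (`--max-memory-per-child` only recycles a child after it finishes a task, so it doesn't lower the per-process baseline.) A start-command sketch, assuming your Celery app module is named `app` (swap in your own module name):

```shell
# Cap the prefork pool at 2 child processes instead of one per CPU,
# so idle memory is roughly 2x one worker's footprint rather than Nx.
celery -A app worker --concurrency=2 --loglevel=info
```

If tasks start queueing up, raise `--concurrency` a step at a time while watching the memory graph.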


chrisswhitneyy
PROOP

2 years ago

Thank you so much! That solved the issue!


brody
EMPLOYEE

2 years ago

Awesome!

