Celery worker container dies while Starting Pool

sjpriest
HOBBY

a year ago

Project ID: 1639f59c-5c44-42c3-9a46-6b7566be82f3

Running a Django application with Postgres + Redis. The setup has been fine so far, apart from a few silly misconfigurations on my part, but I'm a little stumped on this one since there's very little data to suggest what might be going wrong.

I'm attempting to deploy the Celery worker as a separate service based on the same repo that holds the Django project. I looked briefly over one of the templates and didn't see anything drastically different from my configuration, or perhaps I'm just not experienced enough to spot it. Here are the logs leading up to the failure:

 -------------- celery@dce1163c1a98 v5.4.0 (opalescent)
--- ***** -----
-- ******* ---- Linux-6.1.0-9-cloud-amd64-x86_64-with-glibc2.39 2024-07-26 15:14:01
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         akashi:0x7fd6d473ef50
- ** ---------- .> transport:   redis://default:**@redis.railway.internal:6379/0
- ** ---------- .> results:     redis://default:**@redis.railway.internal:6379/0
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]

[2024-07-26 15:14:01,600: DEBUG/MainProcess] | Worker: Starting Hub
[2024-07-26 15:14:01,600: DEBUG/MainProcess] ^-- substep ok
[2024-07-26 15:14:01,600: DEBUG/MainProcess] | Worker: Starting Pool

container event container died
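
For reference, the app module itself is just the standard Django + Celery wiring, roughly along these lines (a sketch of the usual layout only, not the exact file; just the project name "akashi" comes from the logs above):

# akashi/celery.py (sketch) -- standard Django + Celery app module layout;
# everything except the project name "akashi" is the usual boilerplate.
import os

from celery import Celery

# Make sure Django settings are available before the app is created.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "akashi.settings")

app = Celery("akashi")

# Read any CELERY_*-prefixed options from Django's settings module.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Pick up @shared_task functions from each installed app's tasks.py.
app.autodiscover_tasks()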


a year ago

What start command are you using for Celery?


sjpriest
HOBBY

a year ago

celery -A akashi worker --loglevel=DEBUG -E


a year ago

How many jobs do you think you could have running at the same time?


sjpriest
HOBBY

a year ago

Not a lot; there are only 4 defined tasks. When I run it locally, I use it with Celery Beat for cron tasks.
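
Locally the beat schedule is just a handful of entries along these lines (illustrative only; the task name and timing here are made up):

# settings.py (sketch) -- a hypothetical Celery Beat entry for the kind of cron
# task mentioned above; "akashi.tasks.example_task" is a made-up task path.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "nightly-example": {
        "task": "akashi.tasks.example_task",
        "schedule": crontab(hour=3, minute=0),  # every day at 03:00
    },
}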


sjpriest
HOBBY

a year ago

AFAIK, it shouldn't be executing anything at all.


sjpriest
HOBBY

a year ago

I also added logger.info statements at the beginning of all shared tasks, and none of those appear in the logs.
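
For what it's worth, every task follows the same pattern of logging first and then doing its work, roughly like this (illustrative sketch, not one of the real tasks):

# tasks.py (sketch) -- "example_task" is a made-up name; the point is only the
# logger.info call at the top, which never shows up in the worker logs.
import logging

from celery import shared_task

logger = logging.getLogger(__name__)


@shared_task
def example_task():
    # If the worker ever picked this up, this line would appear in the logs.
    logger.info("example_task started")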


a year ago

You are likely running out of memory on the trial plan, which is why your app is crashing. Try this as the start command instead:

celery -A akashi worker --loglevel=DEBUG -E --concurrency 1
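
If you'd rather keep that out of the start command, the same cap can live in the Celery config instead (a sketch assuming the usual Django setup where settings are read with namespace="CELERY", so the CELERY_ prefix maps onto Celery's worker_concurrency option):

# settings.py (sketch) -- equivalent to passing --concurrency 1 on the CLI;
# assumes app.config_from_object("django.conf:settings", namespace="CELERY").
CELERY_WORKER_CONCURRENCY = 1

Note that the CLI flag takes precedence if both are set, so pick one place for it.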

sjpriest
HOBBY

a year ago

Ok, will do


sjpriest
HOBBY

a year ago

Yup! That did it, thanks very much. GPT suggested that possibility but I foolishly ignored it because I didn't realize resources were limited like that on the trial.


a year ago

yep, 500 MB of RAM


sjpriest
HOBBY

a year ago

Ah, ok! Good to know.