a year ago
Hey, I am trying to create a Celery service so that I can run tasks in the background, like sending emails or notifications. I looked at the Django guide, but I am a little confused about how it actually works.
A lot of the guides I see online integrate Celery into the actual Flask application, so you can run the Celery function/task straight from the code. Does anyone know how I'd be able to set it up properly?
54 Replies
a year ago
what specifically are you confused about in that guide? i'd be happy to explain anything you're stuck on in more depth
I've set up the Redis instance, and then made a celery_app.py
from celery import Celery
from dotenv import load_dotenv
import os

load_dotenv()

def make_celery(app_name=__name__):
    return Celery(
        app_name,
        broker=os.environ['REDIS_URL'],
        backend=os.environ['REDIS_URL']
    )

celery = make_celery()
a year ago
what command do you run locally to start celery?
a year ago
would I be correct in assuming you have celery running locally just fine?
well, all this code does is run and then stop, so no errors, but it doesn't stay open
would I also need to create another flask server to incorporate the celery service? but I guess that doesn't make sense
a year ago
nope, you'd have two services, one that runs only flask, and one that runs only celery
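(Concretely, that could look like two Railway services pointed at the same repo, each with its own start command; the module paths here are illustrative:)

```shell
# service 1: web, runs only Flask (gunicorn shown as one common choice)
gunicorn app:app

# service 2: worker, runs only Celery
celery -A celery_app worker --loglevel=info
```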
a year ago
but please get this working locally first, then once you got it working locally, I'll be here to help you run it on railway!
Now trying to put it onto railway, and im getting this:
invalid type: string "celery -A tasks worker --loglevel=info", expected a map
that's the start command I'm using, or should I use this one:
celery -A liftoff worker -l info --concurrency=3
But then what would liftoff be?
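(For context: the name after `-A` is just the Python module or package that holds the Celery instance, so `liftoff` only makes sense if you have a `liftoff.py` or `liftoff/` package. With the `celery_app.py` posted earlier it would be something like this, assuming that file sits at the repo root:)

```shell
# assumes celery_app.py at the repo root defines `celery = make_celery()`
celery -A celery_app worker --loglevel=info
```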
a year ago
What command do you use locally to run just celery
a year ago
instead of a procfile, set the command in the service settings

a year ago
yep
a year ago
you also need to attach a branch in the service settings

Also I noticed that when I push to GitHub it's not deploying a new version onto Railway with the /worker folder
a year ago
thats because you didnt have a branch connected up until a few minutes ago, that issue should be sorted out now
a year ago
when you run the celery command locally, do you do that from within the worker folder?
a year ago
can you delete all the __pycache__ folders from your repo and then make sure they are being gitignored
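(A sketch of that cleanup, assuming a unix shell; the demo runs in a scratch directory so it doesn't touch a real repo. If the caches are already committed, you'd also `git rm -r --cached` them:)

```shell
# demo in a scratch directory so nothing real is touched
cd "$(mktemp -d)"
mkdir -p worker/__pycache__
touch worker/__pycache__/tasks.cpython-311.pyc

# one-off cleanup: remove all bytecode cache folders
find . -type d -name "__pycache__" -prune -exec rm -rf {} +

# ignore them going forward
printf '__pycache__/\n*.pyc\n' >> .gitignore
```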
a year ago
so now the issue is that this isn't a typical python project layout; despite there being python files, nixpacks doesn't know that you want it to run python
a year ago
so add a requirements.txt file with all the dependencies you need, and we will see if that helps
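(A minimal requirements.txt sketch for this kind of setup; versions are left unpinned here, but you'd pin them in a real project:)

```
celery
redis
flask
python-dotenv
```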
a year ago
folder structure is as printed in that screenshot, and there is no nixpacks.toml file
a year ago
sweet
a year ago
now, you will absolutely need to set the concurrency; the default value will use a lot of memory
a year ago
set it to the number of jobs you could have running at the same time; if you will only ever run one job at any one time, set it to 1
would it be something like
celery -A app.celery_app worker --concurrency=1 --pool=prefork --loglevel=info --events
what would happen if i try to run a job back to back? does that count as just 1?
a year ago
as long as they run back to back you're fine, since thats still 1 job at any one point in time
I think it will do them one at a time, if you set the concurrency to one.
And it will do it depending on what's in the Redis queue
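(The "one at a time, in queue order" behaviour can be sketched without Celery at all; here a plain deque stands in for the Redis queue and a single loop stands in for a worker with concurrency 1. Purely illustrative, no Celery involved:)

```python
from collections import deque

# stand-in for the Redis list the celery worker reads from
queue = deque(["job-1", "job-2", "job-3"])

def run(job):
    return f"done {job}"

# a worker with concurrency=1: one loop, one job in flight at a time,
# in the order the jobs were enqueued
finished = []
while queue:
    job = queue.popleft()      # take the oldest queued job
    finished.append(run(job))  # runs to completion before the next starts

print(finished)  # ['done job-1', 'done job-2', 'done job-3']
```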
a year ago
see the difference?

a year ago
awesome, any other questions?
a year ago
no problem, i'll mark this as solved, but feel free to open another thread if you need further help!
a year ago
!s
Status changed to Solved brody • about 1 year ago