Matching previous Django docker deployment not working

lesreaper
PRO

a year ago

I recently deployed a Docker setup with Django for a client this spring, and replicated this code:

```python
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": os.getenv("POSTGRES_DB"),
        "USER": os.getenv("POSTGRES_USER"),
        "PASSWORD": os.getenv("POSTGRES_PASSWORD"),
        "HOST": os.getenv("PGHOST"),
        "PORT": os.getenv("PGPORT"),
    }
}
```
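Since Railway also injects a `DATABASE_URL` reference variable (it comes up later in this thread), an equivalent sketch builds the same dict from that single variable using only the standard library. The helper name and the example URL here are made up for illustration:

```python
import os
from urllib.parse import urlparse

def databases_from_url(url):
    """Hypothetical helper: turn a postgres:// URL into a Django DB config.

    Equivalent to the os.getenv() version above when Railway's
    DATABASE_URL reference variable is set.
    """
    parsed = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": parsed.path.lstrip("/"),
        "USER": parsed.username,
        "PASSWORD": parsed.password,
        "HOST": parsed.hostname,
        "PORT": str(parsed.port) if parsed.port else "5432",
    }

# Example with a made-up URL in Railway's usual shape:
cfg = databases_from_url(
    "postgres://app:secret@postgres-dev.railway.internal:5432/appdb"
)
```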

Worked phenomenally for the last client.

BUT, I already had to adjust my CLI start command to the following in order for the DB tables to be created, which the working deployment doesn't need:

```shell
python manage.py migrate && gunicorn mmooServer.wsgi --log-file -
```

Trying now to create the superuser with `railway run python manage.py createsuperuser`, and it's nothing but problems. I looked at this, and it wouldn't work because I have a staging area and production, with two separate databases. On the last deployment, Railway figured out all of these connections through the `railway link` and `railway run` process.

This is my current error:
```
django.db.utils.OperationalError: could not translate host name "postgres-dev.railway.internal" to address: Name or service not known
```

I'm matching the docker-compose.yml, Dockerfile, and entrypoint.sh file EXACTLY, and even named the containers the same between the different client areas.

I'm kind of at a loss. I don't want to change the .env variables, because I really didn't have to on the last deployment, and I would have to switch them to development and then back to production every time I wanted to create a superuser or run other CLI commands. How do I manage this, please?

Solved

5 Replies

a year ago

Hello,

You need to use the public host and port for the database when running commands locally.

You didn't need to previously because the public host and port were used by default. That is no longer the case, because we found that too many users were footgunning themselves and racking up large egress bills by having their applications connect to their databases publicly.
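One way to express this switch in settings code, as a sketch: `PGHOST_PUBLIC` and `PGPORT_PUBLIC` are hypothetical extra variables you would set yourself from the database's public networking tab, and `RAILWAY_ENVIRONMENT` is assumed to be present only on deployed Railway services:

```python
import os

def db_host_port():
    # On a deployed Railway service, use the private host/port
    # (private networking, no egress cost).
    if os.getenv("RAILWAY_ENVIRONMENT"):
        return os.getenv("PGHOST"), os.getenv("PGPORT")
    # Locally, fall back to the hypothetical PGHOST_PUBLIC/PGPORT_PUBLIC
    # values copied from the database's public networking settings.
    return os.getenv("PGHOST_PUBLIC"), os.getenv("PGPORT_PUBLIC")
```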


Status changed to Awaiting User Response Railway 12 months ago


lesreaper
PRO

a year ago

So, that doesn't work for some reason.

I shouldn't have to touch my local Docker install at all, correct? From what I understand, Railway shouldn't be touching my local site running in a Docker container at all; wouldn't that be a security risk, since I would have to add localhost to all the ALLOWED_ORIGINS for my Django site in development and production?

In any case, these are my variables on the development Django server:
```
POSTGRES_USER="${{Postgres-dev.POSTGRES_USER}}"
POSTGRES_PASSWORD="${{Postgres-dev.POSTGRES_PASSWORD}}"
POSTGRES_DB="${{Postgres-dev.POSTGRES_DB}}"
PGHOST="${{Postgres-dev.PGHOST}}"
PGPORT="${{Postgres-dev.PGPORT}}"
DATABASE_URL="${{Postgres-dev.DATABASE_URL}}"
```
What should they be if this isn't working?


Status changed to Awaiting Railway Response Railway 12 months ago


a year ago

You are currently using the private host and port for the database; you need to use the public host and port when running commands locally.


Status changed to Awaiting User Response Railway 12 months ago


lesreaper
PRO

a year ago

I appreciate the feedback, Brody, and I did sort of get this to work, but I have to say it's quite a mess compared to what it used to be.

For further clarification, to anyone using Postgres and Django, here is what you have to do to create a superuser for all your environments.

1. Get the public URL from your Django settings link in your Railway portal. It should look something like `autorack.proxy.rlwy.net:36136`.
2. Make these changes to your Django site Variables, your Postgres instance Variables, AND set up an entirely separate .env-xxxx for your local instance:
```
PGHOST="autorack.proxy.rlwy.net"
PGPORT="36136"
```
3. You are going to have to MANUALLY adjust your docker-compose.yml file for EACH environment you are using on Railway, and add the `PGPORT`, `PGHOST`, `PGUSER`, etc. to your local environments. You'll need to change this every time you want to run a CLI command:
```
    env_file:
      - .env-xxxx
```
4. Boot up your local Docker instance with something like `docker compose --env-file .env-xxxx up --build --force-recreate`.
5. Run `railway link` and then `railway run python manage.py createsuperuser`.
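The whole workflow above, as a single transcript sketch (the env file name is a placeholder; `railway link` prompts you to pick the matching project and environment interactively):

```shell
# .env-railway-dev holds the public proxy host/port for the dev database
docker compose --env-file .env-railway-dev up --build --force-recreate -d

# point the Railway CLI at the right project/environment, then run the command
railway link
railway run python manage.py createsuperuser
```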

I was able to create a superuser, I believe, but there are other errors I'm having now, which might be related to all of this, who knows. This is WAY more complicated than Railway used to be with a simple docker-compose.yml, and probably a bit slower having to use public addressing, but this should work for basic commands. I'll come back to this once I figure out how to get my actual Django app to appear in a Railway environment. I still can't get it to load, even though I'm using the same exact Django project settings as a currently running project.

The logs are telling me (it does connect to PG now, it seems):
```
container event container died
```


Status changed to Awaiting Railway Response Railway 12 months ago


a year ago

We have never supported docker-compose.

The private host and port are now the default, to prevent users from footgunning themselves with egress costs.


Status changed to Awaiting User Response Railway 12 months ago


Railway
BOT

2 months ago

This thread has been marked as solved automatically due to a lack of recent activity. Please re-open this thread or create a new one if you require further assistance. Thank you!

Status changed to Solved Railway 3 months ago

