2 months ago
My deployments used to be fine and fast. Now the build process hangs for a long time for no apparent reason; I need your help to understand why.
[Region: asia-southeast1]
==============
Using Nixpacks
==============
context: 2c1d-qjtt
╔══════════════════════════════ Nixpacks v1.38.0 ══════════════════════════════╗
║ setup │ python311, gcc ║
║──────────────────────────────────────────────────────────────────────────────║
║ install │ python -m venv --copies /opt/venv && . /opt/venv/bin/activate ║
║ │ && pip install -r requirements.txt ║
║──────────────────────────────────────────────────────────────────────────────║
║ start │ python cvd_railway.py ║
╚══════════════════════════════════════════════════════════════════════════════╝
internal
load build definition from Dockerfile
0ms
internal
load metadata for ghcr.io/railwayapp/nixpacks:ubuntu-1745885067
535ms
internal
load .dockerignore
0ms
internal
load build context
0ms
stage-0
COPY .nixpacks/nixpkgs-bc8f8d1be58e8c8383e683a06e1e1e57893fff87.nix .nixpacks/nixpkgs-bc8f8d1be58e8c8383e683a06e1e1e57893fff87.nix
34ms
stage-0
RUN nix-env -if .nixpacks/nixpkgs-bc8f8d1be58e8c8383e683a06e1e1e57893fff87.nix && nix-collect-garbage -d
43s
23 store paths deleted, 245.31 MiB freed
stage-0
COPY . /app/.
425ms
stage-0
RUN python -m venv --copies /opt/venv && . /opt/venv/bin/activate && pip install -r requirements.txt
15s
Successfully installed blinker-1.9.0 certifi-2025.8.3 charset_normalizer-3.4.3 click-8.3.0 flask-3.1.2 idna-3.10 itsdangerous-2.2.0 jinja2-3.1.6 markupsafe-3.0.3 numpy-2.3.3 pandas-2.3.3 python-dateutil-2.9.0.post0 pytz-2025.2 requests-2.32.5 six-1.17.0 tzdata-2025.2 urllib3-2.5.0 websocket-client-1.8.0 werkzeug-3.1.3
stage-0
RUN printf '\nPATH=/opt/venv/bin:$PATH' >> /root/.profile
152ms
stage-0
COPY . /app
44ms
exporting to docker image format
4m 55s
exporting to image
4m 55s
auth
sharing credentials for production-asia-southeast1-eqsg3a.railway-registry.com
0ms
importing to docker
2 months ago
Hey there! We've found the following might help you get unblocked faster:
If you find the answer from one of these, please let us know by solving the thread!
2 months ago
Actually, it did something this time:
importing to docker
12s
=== Successfully Built! ===
Run:
docker run -it production-asia-southeast1-eqsg3a.railway-registry.com/4e4f9834-6eea-487e-ace9-63617fdc8ba9:1344c865-2b35-461d-9f6e-d649a2e081b0
Build time: 459.44 seconds
But other times it can hang for more than 16 min. Why is there so much difference in deployment time?
2 months ago
Before:
importing to docker
9s
=== Successfully Built! ===
Run:
docker run -it production-asia-southeast1-eqsg3a.railway-registry.com/4e4f9834-6eea-487e-ace9-63617fdc8ba9:c7837b29-4b7d-4d7a-a460-fa0628b83076
Build time: 92.68 seconds
2 months ago
The code base is the same. Please explain why it used to take 92.68 seconds and now takes 4-5x longer.
2 months ago
Hi there, are you still running into this? This was likely a transient issue at the time — we've cycled our machines which should already help with this but please let us know if you still continue to run into this!
Status changed to Awaiting User Response Railway • 2 months ago
2 months ago
Yes, I still encounter the same issue. Sometimes it deploys within 1 min, and sometimes it hangs and only gets deployed after 5-8 min or more. Why?
Status changed to Awaiting Railway Response Railway • about 2 months ago
2 months ago
I need the deployment time to be consistent for my system and under 3 min. Do you think that's possible?
2 months ago
Hello,
This should be fixed now. We have made a few changes in our orchestrator to mitigate this.
Status changed to Awaiting User Response Railway • about 2 months ago
Status changed to Awaiting Railway Response Railway • about 2 months ago
Status changed to Awaiting User Response Railway • about 2 months ago
2 months ago
Building the image is not the issue; it usually builds super fast. The issue is in your system; please check the blocking process.
Attachments
Status changed to Awaiting Railway Response Railway • about 2 months ago
2 months ago
Building the image is not the issue, that is correct, but building a smaller image can help.
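For reference, a minimal .dockerignore sketch (the entries below are illustrative, not taken from your repo) that keeps unneeded files out of the build context, and therefore out of the image that has to be exported and uploaded:

# .dockerignore (illustrative; adjust to what your repo actually contains)
.git
__pycache__/
*.pyc
.venv/
# large local data files, if any, that the app does not need at runtime
data/

Everything that COPY . /app picks up ends up in the final image, so trimming the context directly shrinks what the export step has to push.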
Status changed to Awaiting User Response Railway • about 2 months ago
2 months ago
It's getting worse. Please explain why this process needs so much time.
Attachments
Status changed to Awaiting Railway Response Railway • about 2 months ago
Status changed to Awaiting User Response Railway • about 2 months ago
2 months ago
This is a network issue, guys!
exporting to docker image format
10m 48s
importing to docker
7s
auth
sharing credentials for production-asia-southeast1-eqsg3a.railway-registry.com
Status changed to Awaiting Railway Response Railway • about 2 months ago
2 months ago
That step is the image upload process. Larger images take longer to upload, while a smaller image will upload faster and complete that step in less time.
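If you want to trim the Python layer as well, one option (a sketch, assuming a nixpacks.toml at the repo root; it mirrors the install and start commands from your build log but adds pip's --no-cache-dir flag) is:

# nixpacks.toml (sketch; same plan as the generated one, without pip's download cache)
[phases.install]
cmds = [
    "python -m venv --copies /opt/venv && . /opt/venv/bin/activate && pip install --no-cache-dir -r requirements.txt"
]

[start]
cmd = "python cvd_railway.py"

Without --no-cache-dir, pip's download cache under /root/.cache/pip stays inside that layer, so dropping it means the "exporting to docker image format" step moves a bit less data.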
Status changed to Awaiting User Response Railway • about 2 months ago
2 months ago
So please explain to me: yesterday it built and uploaded within 2 min, and today, with no change in the code, it takes more than 10 min.
stage-0
COPY . /app
106ms
auth
sharing credentials for production-asia-southeast1-eqsg3a.railway-registry.com
0ms
importing to docker
10s
=== Successfully Built! ===
Run:
docker run -it production-asia-southeast1-eqsg3a.railway-registry.com/4e4f9834-6eea-487e-ace9-63617fdc8ba9:5de510bf-4d26-41dc-8e5a-f939cfa5d33e
Build time: 85.42 seconds
Status changed to Awaiting Railway Response Railway • about 2 months ago
2 months ago
Now it even times out:
stage-0
RUN printf '\nPATH=/opt/venv/bin:$PATH' >> /root/.profile
137ms
stage-0
COPY . /app
21ms
importing to docker
8s
auth
sharing credentials for production-asia-southeast1-eqsg3a.railway-registry.com
0ms
Build timed out
Again, no change in the code.
2 months ago
Hello!
We're acknowledging your issue and attaching a ticket to this thread.
We don't have an ETA for it, but our engineering team will take a look, and you will be updated as we update the ticket.
Please reply to this thread if you have any questions!
2 months ago
🛠️ The ticket Deployment delays has been marked as todo.
a month ago
Now it's this stage that is randomly taking forever.
Attachments