2 months ago
Hi Railway Support,
I have a backend service that's exhibiting a networking issue:
Service ID: 250cdb01-1559-46b3-8a39-6c1f4107c645
Project ID: e075d275-91d8-46c7-bc0b-f9adba36e74b
Issue:
- Container starts successfully (logs: "Uvicorn running on http://0.0.0.0:8080")
- HTTP requests arrive (visible in HTTP Logs tab)
- All requests return 502 Bad Gateway
- This happens with MINIMAL apps (15 lines of code, just FastAPI + 2 endpoints)
- Tested with both Dockerfile and Nixpacks builders - same result
This conclusively rules out application code issues. The container is running,
but Railway's load balancer/proxy cannot route traffic to it.
Can you please check:
1. Is there a networking configuration issue with this service?
2. Are there firewall rules or security groups blocking traffic?
3. Is the internal routing corrupted from previous failed deployments?
4. Are container resource limits causing silent failures?
I'm happy to provide any additional logs or information needed.
2 Replies
2 months ago
Maybe try using the PORT env variable: https://docs.railway.com/guides/public-networking
import os

if __name__ == '__main__':
    app.run(debug=True, port=int(os.getenv("PORT", default=5000)))
You could also try reaching your FastAPI endpoints through local network. Deploy a new container, connect using CLI https://docs.railway.com/guides/cli and try to reach it using the private network endpoint.
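The core of that fix is just reading and casting the variable before the server binds. A minimal stdlib sketch (the `resolve_port` name and the 8000 fallback are my own choices, not from Railway's docs):

```python
import os

def resolve_port(default: int = 8000) -> int:
    # Railway injects PORT into the container at runtime; fall back
    # to a local default when the variable is unset (e.g. local dev).
    return int(os.getenv("PORT", default))
```

Whatever `resolve_port()` returns is what you then pass to `uvicorn.run(...)` or `app.run(...)`, so the server and Railway's proxy agree on the port.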
2 months ago
Your logs show: Uvicorn running on http://0.0.0.0:8080. This means your application has hardcoded port 8080. Railway dynamically assigns a port to your container via the PORT environment variable. If Railway routes traffic to the port it assigned, but your app is only listening on 8080, the load balancer returns a 502.
The fix:
- Update your code to listen on the port provided by the environment, rather than a hardcoded integer: port = int(os.environ.get("PORT", 8080))
- Or set PORT to 8080 under your service -> Variables, so Railway routes traffic to the port your app is already listening on.
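If the service is launched from a start command rather than from Python, the same fix applies there; assuming the usual main:app module path (substitute your own), the command would look like:

```shell
uvicorn main:app --host 0.0.0.0 --port $PORT
```

$PORT expands to whatever Railway injected at runtime, so no hardcoded port number is left anywhere.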